As a side assignment this quarter we are supposed to “notice” the little UI details that are almost designed to go unseen, the kind built into everyday digital interfaces to affect us in a subconscious or intuitive way.
Take iOS on the iPhone, for example: the subtle shading behind the apps on the home screen, and the animated movement from one home screen to the next, with the ever-so-slight slowing of the movement as it reaches its next anchor point. Or the fun way it bounces back if you don’t slide it far enough.
Wanting to test just how “real” this little feature was, one of my first inclinations was to see if I could get the phone to balance perfectly between two home screens: slowly pulling the icons to the left and gently lifting my finger off the touch screen, as if the movement of my finger pulling away might cause some turbulent air current and knock the home screen, carefully positioned between the two “desktops,” off balance to one side or the other.
I have on numerous occasions sat there trying again & again to get it balanced just right.
But of course my practiced attempts to apply physical tactics to this digital interface never produced different results. It was all up to the coded program to decide which direction the screen would “fall,” based entirely on defined input: “If the home screen’s position is at pixel 319, slide right. If position = pixel 320, slide right. If position = pixel 321, slide left.” And so on.
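That decision rule can be sketched in a few lines. This is purely illustrative, not Apple’s actual code: the function name, the pixel values, and the threshold are all made up for the sake of the example, but they show why no finger position can ever leave the screen balanced in between.

```python
def snap_direction(position: int, threshold: int = 320) -> str:
    """Decide which way the home screen 'falls' after a drag.

    position: how far, in pixels, the screen has been dragged.
    threshold: a hypothetical tipping point (illustrative only).
    """
    # At or below the threshold the screen springs back (slides right);
    # past it, the screen commits to the next page (slides left).
    # Every integer position falls strictly on one side of this
    # comparison, so there is no pixel where the screen can balance.
    return "right" if position <= threshold else "left"

print(snap_direction(319))  # right
print(snap_direction(320))  # right
print(snap_direction(321))  # left
```

However carefully the finger lifts away, the program only ever sees a number, and the comparison always resolves one way or the other.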
But I still playfully try to get the screen to balance perfectly in between. Perhaps thinking I can fool the computer. Perhaps trying to champion the digital world, getting it JUST right, causing the program to freeze up out of pure incompatibility!
“Ha!” I would shout.