Sketching, Storyboarding, and Critique

From CS160 Spring 2013

Readings

Optional

Reading Responses

Elizabeth Hartoog - 1/28/2013 19:27:14

The example I would use is from back when they first started introducing those touchless toilet flushers. You would sit down, do your thing, and when you got up it would flush the toilet automatically. One time I got up and it didn't flush, and I wasn't really clear what to do at that point. I had been trained to kick the flusher with my foot. I flailed my hand in front of what I assumed was some kind of detector and nothing happened. I finally noticed that on top of the whole contraption there was a black button that blended in with the black plastic, and that this was probably what I needed to push.

The issue was that, at the time, this toilet didn't fit my mental model of how to use a toilet. There was no handle on the side of the toilet and there wasn't a long flusher sticking out from the side. The other problem was that this thing had extremely poor visibility: if it's on black plastic, the button shouldn't be black as well. I also had no sure idea what that button would do. Sure, I assumed it would flush the toilet, since what else would a button on a toilet do, but there was no icon or text to indicate that was definitely the case, and this was a fancy new electronic toilet. Needless to say, I just spent way too long talking about toilets. The problems I had with this toilet are mostly fixed now. For example, on campus we have those toilets, but they also have bright red buttons so they're easy to spot. They still don't have any kind of label to identify them, but at this point it's much more likely that the button fits our mental model.

A difference in affordance for physical devices is that the designer has a third dimension to convey what the device is for and how it should be used. In the software world, that is not necessarily the case. To compensate we will often draw things that look 3D to convey the affordances. Do we want a dial that is turned? Physically we would create a small dial to turn, but in software you'd need to design something compatible with mouse movement that conveys the same concept, which in this case would be a sliding bar (sketched below). Norman mentions perceived versus actual affordances. Everything we look at has a perceived affordance: I think that chair should be sat on, and that bookshelf should be used to store books. But maybe that chair is actually for decoration, meant to hold lightweight, expensive dolls, and couldn't carry my weight. Or maybe that bookshelf is actually a secret entrance to a dungeon. Those would be the actual affordances.

In the physical world we are also allowed the use of all of our senses to come to a conclusion about the affordance of objects. I can pick up a tricycle and spin its wheels and watch the pedals turn with it to try and figure out how it works. I can feel the metal pieces attached to the plastic seat and handlebars. I can smell the oil in the gears and the metallic frame. I can hear the pedals move the chain. And if I am a stupid child, I can lick the handlebars to see what they taste like. As far as software goes, there is no smelling, licking, or touching of the different parts of the software. All our assumptions about the affordances of software come only from our visual and auditory senses. Though on the Android platform there is vibration, which adds an element of touch, though not in the same way.
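
To make the dial-versus-slider point above concrete, here is a minimal Java Swing sketch (the class name and values are illustrative, not from the reading): an on-screen slider stands in for a physical dial, the affordance of twisting is replaced by the affordance of dragging, and the numeric readout gives immediate feedback.

```java
import javax.swing.*;

// Minimal sketch: an on-screen slider standing in for a physical dial.
// Dragging the thumb replaces twisting the dial; the label gives feedback.
public class SliderDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Volume");
            JSlider volume = new JSlider(0, 100, 50);   // min, max, initial value
            JLabel readout = new JLabel("50");

            // Immediate visual feedback as the "dial" is adjusted.
            volume.addChangeListener(e -> readout.setText(String.valueOf(volume.getValue())));

            JPanel panel = new JPanel();
            panel.add(volume);
            panel.add(readout);
            frame.add(panel);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```

The drawn track and thumb are what Norman would call perceived affordances; the only actual affordance is that the mouse can be clicked and dragged anywhere on the screen.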

Jeffery Butler - 1/28/2013 22:05:02

1) A very simple device that seemed to intimidate everyone at a startup company I worked at over the summer was a door with two pull handles, one on either side of the door. In addition to the door having two pull handles, the handles were shaped parallel to the door and there was glass on either side of the door, giving users a difficult time understanding which side of the door to pull or push from. This poorly designed door violated the design principle of visibility. This door would have been designed more effectively if: 1. there were a pull handle on one side of the door and a plate on the other, indicating which side to push and which to pull; 2. the plate were placed on one edge of the door and the pull handle on the other side were warped, implying where the user should pull from; or 3. (the solution I implemented) a sign were written on the door telling users to either push or pull depending on the side. However, solution number 3 ruins the implicit aspect of an affordance.

2) Affordances give implicit clues to the user on how to properly operate things. There are many differences between software and physical affordances. In software, designers have to work with users acting through a layer of abstraction on intangible controls in a two-dimensional plane. Therefore, how do you represent a forward or backward button or other implied affordances? Also, where do you put this option (mapping)? And how do you tell the user that an operation is actually happening after they click a button (feedback)? With a physical affordance a person can simply walk back to where they were before, or feel a knob that turns within their grip. Perceived affordance in relation to software becomes especially tricky in situations where, say, a button on a web browser might be labeled "back" but the user could think that button relates to going back to their desktop, not necessarily the previous webpage. Therefore, perceived and actual affordances grow in complexity as the affordance becomes less tangible due to the amount of abstraction.


Timothy Ko - 1/28/2013 23:48:17

In my car, there is a compartment between the driver's seat and the front passenger's seat. This compartment is split into bottom and top sub-compartments. There are two switches on the lid of the whole compartment that you can pull, but they are arranged side by side. I can't even remember now which switch opens which part of the compartment. Does the left switch open up the top compartment, or does the right? The same problem arises for how to open the bottom compartment.

This compartment device violates the principle of good, natural mapping. There is no natural mapping between left/right and top/bottom. There may be cultural ones, but that kind of design isn't general enough. To redesign the device, I would arrange the switches with one on top of the other, which would produce a more natural mapping. They wouldn't even have to be 1-to-1 mappings, with the bottom switch being exactly where the compartment divides into top and bottom; as long as the relative locations of the two switches were clear, that would be all I needed.

With physical devices the relation between the object and physical action is direct. If a physical button affords pushing, you physically push the button. However, for the virtual interface, it is the virtual action, not the physical one, that users think of when thinking of affordances. For example, a user will think that a scroll down bar affords dragging the bar up and down, not that it affords moving the mouse up and down while holding the left mouse button. This is fortunate, as the very same physical act on the mouse could be used for a number of other functions, like dragging a desktop icon.

With this in mind, what Norman means by “perceived” affordances are actions that a user perceives a device to provoke or hint at. What he means by “actual” affordances are the physical actions required to carry out the perceived actions. For physical devices, these happen to be the same thing; for software user interfaces, they aren’t necessarily the same.


Colin Chang - 1/29/2013 0:18:57

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

My bathroom sink comes to mind. My sink handle has an appearance you might recognize: a protruding handle centered on a spherical base. Such an appearance, from past experience with similarly shaped sink handles, suggests the following operation/function mappings: up (directly up) means faucet on with lukewarm water, up/left means faucet on with hot water, up/right means faucet on with cold water, and down means faucet off.

Unexpectedly, the following mapping is the actual case: up (directly up) means faucet on with cold water, everything else means faucet off.

Of what was discussed in the first chapter, this is likely a failure of affordances. While the physical shape of the faucet handle suggested methodology of operation, the actuality was different.

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

One of the main differences between affordances of physical devices and affordances of software user interfaces is that in software there are no actual affordances (though, if we expand our consideration beyond the software itself, computer hardware like the mouse and keyboard can be considered actual affordances). In this context (I assume we mean the context of software user interfaces), the perceived affordances are what is seen on the screen. Perceived affordances are often carried from the physical world to the software world, mimicking form and function (e.g. consider the similarities between physical buttons and virtual buttons). Additionally, exclusively digital affordances have arisen as a consequence of technology's pervasiveness (e.g. consider the "tail-eating" arrow meaning "reload" on your browser, or the idea of circular motion meaning "loading" (e.g. Apple's pinwheel)).
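
The "circular motion means loading" convention mentioned above is a good example of a purely digital affordance. As a hedged illustration (the class name is mine, and this is only one way to do it), a Swing indeterminate progress bar has no physical analogue, yet most users immediately read its animation as "wait, something is in progress":

```java
import javax.swing.*;

// Sketch of a purely digital "loading" affordance: an indeterminate
// progress bar has no physical analogue, yet users read it as "please wait".
public class LoadingDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JProgressBar spinner = new JProgressBar();
            spinner.setIndeterminate(true);   // animates without a known endpoint

            JFrame frame = new JFrame("Loading...");
            frame.add(spinner);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```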

Lauren Fratamico - 1/29/2013 0:29:55

I was using a shower where the mechanism to change from bathtub mode to shower mode (so the water will come out the top) was a circular surface below the faucet that you had to pull down. Unfortunately it blended in with the design, and it was not obvious that there was something there to control where the water was directed. This violates Norman's affordance, consistency, and visibility principles of design. Since it was (maybe?) more visually appealing not to have a stopper to maneuver up or down above, I would redesign it so that it had more of a groove above the circular surface (near where the water comes out of the faucet) to make it more obvious that there is a moving part there.

The author claims that "when simple things need pictures ... the design has failed." This is one of the major differences between affordances of physical devices and software UIs. UIs often need pictures to describe their functionality because otherwise it is just a flat screen. Take computers, for example. As Bjoern mentioned on the first day of class, there are icons on a computer designed to look like items that were once in offices. These are pictures; however, I would argue that they are necessary for describing what the icons are used for. In this case the object has no "actual" affordance, as you only perceive an icon of a trash can as affording scrapped items; the icon itself could not afford that. So I believe that in this context, Norman would mean "actual affordance" to be what the object actually does virtually. Does it look like a trash can and serve as a place to store trashed documents? Then in this case it would be both a "perceived" and an "actual" affordance.

Brent Batas - 1/29/2013 11:55:55

Question 1)

My Chronos chess clock (pictured here: E301-2.jpg)

The clock has some 30 settings, but I’ve really only used maybe 5 of them. The clock is very easy to use when it is set up: just push the button on your side after you make your move. The problem is initially setting it up with your desired time settings.

One problem is the unnatural mapping between settings and controls. In order to change the time settings, you need to press arbitrary button combinations. For instance, if you want to set up the clock for a "blitz" game (5 minutes each), you need to press your finger down on the two metal circular plates, then while holding those down, press the middle square button. Then remove your fingers from the two metal plates, and press the middle square button one more time. Then press the metal circular plate on the right-hand side. The only way to figure this out is to 1) have someone tell you, 2) read through the 48-page manual, or 3) spend half an hour guessing and checking. What's worse: let's say you want to play again the next day. You'll need to set up the clock again, and that requires you to remember what buttons you hit.

I would redesign the clock with a panel on the bottom that has dedicated controls for adjusting the time, toggling 5 second delay, toggling beep noise, toggling lights, and other specific settings. It would also have dedicated controls for saving and switching to certain settings, kind of like how you save your favorite radio stations. This panel would have a cover, so that overall the clock would still maintain its current beautiful appearance.

Another problem with this device is the use of metal circular plates instead of buttons. These plates replace the traditional buttons found in most chess clocks. While the plates are a novelty, they are confusing because they don't actually "press down." This is especially confusing in the quite common use case where you capture an opponent's piece and, with that piece, try to press the clock. This is common in blitz games, where you save 1 or 2 valuable seconds by being able to capture a piece and press the clock in one fluid motion. Normal "button" clocks all support this, but the fancy heat-sensor plates don't, since the pieces do not emit heat.

I would redesign the clock using actual buttons instead of the plates, because buttons are a common affordance that people know to push without any explanation needed. Buttons would support the use case I described above. Also, you get the physical feedback of the button’s underlying spring pushing up against your finger, and when you force that spring to compress, you *know* you’ve pushed a button. You can hear the button hit the plastic, and overall there is a clear sense of causation. The plates try to simulate this by having a digitally produced tone and an LED that fires off when you touch the plate, but having used chess clocks extensively and talked to many other chess players, the feedback of the physical buttons is just so much more natural and satisfying. There seems no good reason why the heat plates are used, other than to be fancy.

Question 2)

The concept of affordances applies to software user interfaces as well as physical devices; for instance, a button graphic affords pushing (clicking). However, computers are prone to spawning perceived affordances that are actually bogus. Norman talks about how if you touch a computer just before it fails, you would be led to believe that you caused it to fail. Another example: if your computer is lagging, you might see no result from pressing a button and conclude that your action did nothing. So you press it again and again, and eventually the computer (once it unfreezes) will behave as if you pushed the button many times, probably doing something unwanted in the process. Another example I've come up with is someone who always goes to Yahoo.com before going to any other webpage. If Yahoo is up, then they will probably be able to get to another webpage successfully. If Yahoo is down, then they will probably not be able to get to the other webpage successfully. So this person would have the perception that visiting Yahoo.com enables access to all other internet sites.

Software has bugs and is flaky, so it can seem nondeterministic to a user, which creates these perceived affordances which may be different than actual affordances.
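
The lagging-button scenario above is really a feedback problem, and a common remedy in UI code is to acknowledge the click immediately and disable the control until the work finishes. Here is a hedged Swing sketch of that idea (the class name, labels, and two-second delay are stand-ins, not anything from the reading):

```java
import javax.swing.*;

// Sketch: keep perceived and actual affordances aligned by giving immediate
// feedback. The button is disabled while the slow task runs, so a lagging UI
// can't invite repeated, unintended clicks.
public class FeedbackDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JButton save = new JButton("Save");
            save.addActionListener(e -> {
                save.setEnabled(false);       // acknowledge the click right away
                save.setText("Saving...");
                new SwingWorker<Void, Void>() {
                    @Override protected Void doInBackground() throws Exception {
                        Thread.sleep(2000);   // stand-in for a slow operation
                        return null;
                    }
                    @Override protected void done() {
                        save.setText("Saved");
                        save.setEnabled(true);
                    }
                }.execute();
            });

            JFrame frame = new JFrame("Feedback");
            frame.add(save);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```

Because the button visibly changes state the moment it is pressed, the perceived affordance ("pressing this does something once") stays in line with what the program actually does.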


Soo Hyoung Cheong - 1/29/2013 12:39:38

A badly designed device that I had to use daily is the elevator in my apartment complex. Among Norman's design principles, it violates the principle of feedback. First of all, there is no signal telling where the elevator is currently located, and once you press the button, you have no estimate of how long it will take for the elevator to come. Also, when you get on the elevator, you might mistakenly have pressed a wrong button, but you will not be able to tell until you actually get to the destination floor, since there is nothing that indicates where the elevator is headed. It also violates the principle of mapping, in that you would expect the elevator to go to the floor you want; however, at times, if another person outside presses the button to call the elevator, it goes to the floor from which it is being called instead of to the original destination it was expected to go to. I would redesign the elevator by having the program make sure that the elevator door opens at the requested floor before moving on to a different request, and also by adding a set of LED lights to at least show where the elevator currently is.

There is not much of a difference between affordances of physical devices and affordances of software user interfaces. Obviously, physical devices can have properties that do what they appear to do, and software user interfaces also have certain buttons, like the "minimize," "maximize," and "exit" symbols in the corner of a program, that we perceive to do what we think they do. "Perceived" affordances are the properties that an object seems to hold just from its looks. On the other hand, "actual" affordances are the real properties of an object in terms of what it does and/or what it is made out of, etc.

Marco Grigolo - 1/29/2013 12:41:17

1) My microwave. While it has good mapping — the buttons clearly express power, time, and even automated functions such as warming up poultry or heating milk, and having one button per function helps (a picture of a chicken for the poultry button, waves for power, etc.) — the conceptual model and feedback were just terrible. Neither the screen nor the buttons explained the right procedure to set power and time. I could not enter the power in terms of watts (like 100 for 100 watts; since it accepted only 2 digits, I could not know the relationship between the digits entered and the actual power set), I did not know whether I had to enter the power first and then the time, and whether, after pressing the time button to set the time, the power I had just entered was remembered. Even the automated buttons did not tell me whether they accounted for the weight of the container when heating or not, and the fact that I bought it second-hand, without an instruction manual, forced me to use it by setting only the time and pressing start.

2) In a physical device, the affordances are more about giving the right clue to the user, since the feedback cannot be misinterpreted. In software you also need to account for the feedback, since wrong or delayed feedback might give the user the wrong clue about their action. In this context Norman mentions perceived and actual affordances as the relation between how an object (or a control in the UI) looks and what this shape/look naturally suggests to the user in terms of how to use that object. If this link is natural, then the UI or the physical object will be easy to use.

Michael Flater - 1/29/2013 13:03:31

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

The obvious answer here is remote controls: for your television, your stereo, your DVD player, and the dreaded universal remote. While the design of such a vast array of devices is undoubtedly hard to make intuitive, there are some things we can do to improve them. The industry has very few standards. By this I mean that some remotes have one button to turn the device on and off, while others actually have an on button and an off button, for which there is no logical explanation. I actually have a universal remote with two on buttons. The second on button controls just the cable box, while the other on button controls whichever other device you have highlighted at the top of the remote. This violates two design principles: it is not a good mapping (which "on" operates which device?), and there is visibility only if all devices are in the right state (I cannot tell if the cable box turned on unless the TV is already on, or unless the little light pops on, but that only happens after a short boot-up). To solve this problem, which is not easy, we can adopt some industry standards. Simple ones like: only one on/off button; having the remote clearly show which device it is set to control (by highlighting the correct device symbol at the top of the remote); and having the remote highlight which controls actually do something on the current device by illuminating them with LEDs (the play button doesn't do anything when you are on the TV setting, so why have it look like all the other buttons?).

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

No, since we try to relate our software to physical, real-world things, we can always draw connections between the "materials" and the perceived use. An easy example is on the top of almost every word processor, the cut/paste icons. They are a picture of scissors and glue. This is intuitive because of our perceived notion of both what it means to cut and to paste. Another example is the tabs at the top of your web browser. This is easily perceived to be like a file cabinet. In this context, the perceived versus actual affordances map themselves much like they do in the real world; scissors cut, glue pastes and files contain different information.


Tiffany Lee - 1/29/2013 16:49:18

1) My family's washing machine has one knob for picking the kind of wash and for starting the cycle. To pick the kind of wash, you turn the knob to the desired option. To start the wash, you pull that same knob. The first time I tried to do the laundry I could not figure out how to start it. This device violated Norman's design principle of making things visible: it was not visible that to start the cycle you had to pull the knob. To me, the knob was just for picking the kind of wash and nothing else. It also violated the principle of natural mapping; the knob looked like it was supposed to be turned rather than pulled. To redesign it, I would add a separate control that looked like it was meant to be pushed or pulled for starting the wash.

2) I do not think there is a difference because both require physical actions. For software user interfaces, you still have to move your mouse or in the case of touch devices, use your finger to interact physically with the device. Perceived affordances are what we think will happen when we make a certain action and actual affordances are what actually happens when we make a certain action.

Eric Xiao - 1/29/2013 16:59:15

1) When I was about to write notes after reading the midterm hint, I had just bought a new three subject notebook from Walgreens, and I am now going to return it.

Now that I am back from the store, let me tell you what is wrong with it. The top surface has lots of patterns on it, but no part where black or blue pen, or even marker, would be easily visible for labeling the notebook. The front and back covers are flimsy, almost the same material as the paper itself. The notebook pages themselves are fine, with perforations in case I need to tear out a page. The subject dividers are the same size and material as the notebook paper, easily torn, and they are also yellow, so it takes a bit of effort to even seek them out and grab them in order to switch subjects.

The notebook violated visibility, in that the dividers themselves were hard to see. A cover affords being labeled, but this one denies me that. To redesign the notebook, I would make the cover one color and put most of the labeling of the notebook on the bottom and sides of the cover. The dividers themselves would be folders, in order to save handouts.

2) Perceived affordances are false correlations made between actions and results due to wrong feedback or no feedback given. Actual affordances hint at their general use, like the grooves in a mechanical pencil. Physical devices allow for tactile affordances, and software affordances are primarily visual. Software affordances also rely a lot upon previous user experiences with computers, such as dealing with the desktop or the start button.

Cory Chen - 1/29/2013 17:22:58

1. By far the worst case of bad design I've seen is the shower/tub in my current apartment. The first time I tried to take a shower, I could not figure out for 15 minutes how to make the water come out of the showerhead, so I gave up and took a bath. The problem was that the tub only seemed to have a knob for controlling the temperature. The switch for turning on the showerhead was not visible, and therefore I had to try twisting and pulling every random part of the shower system. I tried pushing and pulling the knob for temperature, flicking the switch for the drain, pulling on the faucet, pulling on the showerhead, and twisting the circle that protrudes out of the faucet, yet nothing worked. I asked my roommate afterwards and found out that you are actually supposed to pull DOWN on the circle that protrudes out of the faucet. The design of the system is utterly unintuitive because the circle/ring looks like an ordinary part of any faucet. Norman's principle of visibility was violated in the design of my shower. I would redesign it so that there is a lever that you can pull on top of the faucet, like we see in many other tubs. That way it would be visible, and a normal person would try it eventually even if they hadn't seen that kind of system before.

2. Yes, there are differences. Physical devices can get affordances from the materials they are made out of, while software can only do that to a limited extent. Also, in software interfaces, where an item is placed on the screen is often the most important thing about it (for example, the upper-right corner means close). When talking about perceived and actual affordances, Norman means the difference between what we intuitively think of when we see an element and what the element can actually do.

Alice Huynh - 1/29/2013 17:26:40

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

I would argue that microwave ovens are one type of appliance that still confuses me to this day. I will only use a microwave by setting a cooking time; I have never used the pre-set buttons for certain types of food. For example, let us look at this photo of a microwave:

[Image: tumblr_m2vrai0l531qetjcco1_500.jpg]

This microwave does not provide a good conceptual model nor does it have a natural mapping. I am referring to two specific buttons on this microwave: the “Express Cook” function and the “Stop/Clear” function.

Express Cook: The only notation to delimit the "express cook" buttons from the regular numbered buttons is a set of brackets. The user sees this bracket, but there is no natural mapping as to what the buttons do. The correct function is to press an express number and then press start, but without reading a manual a user will not be able to figure this out blindly. The user is "operating blindly," as Norman states.

Stop/Clear: There appears to be only one button for two separate functions. Without prior knowledge of microwaves, a user will not know that the button must be pressed twice in order to clear the time on the screen. There isn't a natural mapping from the button to the function, because it's not apparent that a double press will clear the time.

Another button that I’m confused about here is the “Memory” button. As a user, I am confused just by looking at the button as to what it does.


2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

I argue that there isn't a difference between affordances in physical devices and software user interfaces. In lecture, Professor Hartmann showed us an example of the "recycle bin" in the Windows user interface. We know that the symbol represents a trash can in the physical world, which carries the affordance of being a place to throw stuff away. This affordance of the trash bin is used on purpose by the software user interface to exploit the perceived affordance. In this instance, the perceived affordance is the same as the actual affordance.

Another example here that differentiates between perceived and actual affordances is the toilet bowl drinking fountain at the Exploratorium in San Francisco, CA.

[Image: 192410515_1e8fd91f30_z.jpg]

To a normal citizen, a toilet bowl's affordance is for urination and waste-disposal needs. Here, the Exploratorium exploited that affordance by giving it a different "actual affordance": making it a drinking fountain. A lot of visitors are averse to drinking from the toilet bowl because of their "perceived affordance" beliefs.


Haotian Wang - 1/29/2013 18:16:24

1. My computer monitor, which I hook up to my laptop while I'm at home, has pretty bad design. I believe that the monitor violates the design principle of natural mapping, but not of visibility. The main complaint I have about my monitor involves the relationship between the 7 buttons on the monitor and the monitor's settings menu (the menu which adjusts brightness, input, etc.). The buttons are labeled Menu, ch+, ch-, vol+, vol-, input, and power. As would be expected, the Menu button brings up the menu, the vol buttons control volume, the input button brings up a list of inputs (VGA, HDMI), and the power button turns the monitor on/off. However, once the Menu button brings up the menu, the vol+/- buttons are suddenly used to go up/down entries in submenus, while ch+ is used to select an item, and ch- doesn't do anything. There is no logic to this mapping, and I still do not always remember what each button does, even though I use the menu all the time.

2. Some UI elements have affordances similar to physical devices, but other elements do not. The UI elements with similar affordances are similar because they imitate physical entities. For example, volume controls in Windows imitate physical volume controls on a radio, where dragging a lever would also raise or lower the volume. Some elements, like text-input fields, have no such physical analogy, but such elements are widely used enough that people associate text-input fields with typing something.

Kimberly White - 1/29/2013 18:18:00

1. While not quite a device, the first "bad design" that came to mind was Dwinelle Hall, here on campus. It has entrances on multiple levels, inconsistent numbering systems for classrooms, and conventions that change mid-floor. The biggest issue with the design is probably the mental model, and the affordances of numbering systems. For example, if a room number starts with a 1, people assume it'll be on the fourth floor, not floor D. While the physical design of the building makes an interface (numbering system) more difficult, choosing a more conventional approach (label the bottom floor as 1 or 0, correlate floor numbers and room numbers, commit to one system for entire building, etc) would make the building more user friendly, and less maze-like.

2. Actual affordances tend to be related to a physical property, for example, it's hard to write on glass, but easy to write on paper or wood. On a computer, drawing something to look like paper doesn't actually affect the ability to draw, but users will still make the connection that paper is meant to write or draw on. While the physical traits are gone, the relevant ideas and conventions remain.

Si Hyun Park - 1/29/2013 19:07:58

1) The BART ticket dispenser. The BART ticket dispenser is one of the most painful devices I have to use, very often unfortunately. I don't usually carry around cash, so I usually use a credit card to buy tickets. The problem is, when I use a credit card to buy a ticket, the ticket price starts at $20. Most people buy tickets priced in the range of $5-$15, so we have to tick the price down to the price we want every single time we buy a ticket. When we tick down the price, it only goes down by 50 cents per press, so if I want to buy a $5 ticket, I need to press the minus-50-cents button 30 times. What's more, the ticket that we get is non-reusable and flimsy, so it's easy to destroy and creates lots of waste. I believe this design is a result of an "overuse" of Norman's constraints principle. The dispenser constrains users' actions too much: the keypad is unusable except for entering a PIN, and the user only has the option to decrease the price of the ticket by 50 cents per click. To make the system better, I would allow the system to take custom price inputs. There is already a keypad on the vending machine, but it is only used to take in credit card PIN numbers. It would be better if we could use that keypad to enter the actual price that we want, so we don't have to start from $20 or press the minus-50-cents button 30 times. Also, I would change the ticket to a reusable plastic ticket that is much less likely to get destroyed. When the user returns the ticket, the ticket would be reused.

2) By perceived affordance, Norman refers to the qualities of an object that shape how the user expects it to behave. Actual affordance, on the other hand, is the actual behavior of the object, which might or might not be the same as its perceived affordance. I believe that affordances of physical devices and software user interfaces follow the same principle. Most UI elements on software user interfaces carry a perceived quality that leads the user to expect them to behave in a certain way. In fact, some user interfaces (e.g. skeuomorphic ones) borrow from real-life objects to mimic the affordances that such objects carry. For example, you push a paper upwards to see the paper underneath it. The OS X application Preview has a similar interface where the user flicks upwards to flip a PDF page, revealing the page underneath (moving to the next page). When the user sees a PDF document framed in a page-like interface, the user expects it to be "flipped," just like the affordance that a real-life paper carries.

Ben Dong - 1/29/2013 20:03:44

Most stoves I have used tend to be poorly mapped. The burners are usually grouped in a 2x2 square, with the control knobs in a line either on the front or on the side. While these knobs have little pictures indicating which burner they correspond to, it is often difficult to remember which knob controls which burner. A better design would be to have the burners form a sort of trapezoid, with the controls in the center of the trapezoid. Then you could line up the knobs with the burners they control, and there would be an easy, direct mapping.

The main difference in affordances between physical devices and software user interfaces is in how to directly interact with each. Whereas physical devices can be touched, software interfaces can only be interacted with via a touch screen, keyboard, or mouse. However, they are similar in that both physical devices and software interfaces often have an accepted affordance, e.g. doorknobs are meant to be turned and software buttons are meant to be clicked. Perceived affordances differ from actual affordances in that while some objects can actually perform certain actions, they are perceived to be either unable to perform that action or able to perform a different action. For example, using the example in the reading, wood is perceived to afford writing and carving, yet it actually affords breaking just as easily as glass.

Kate Gorman - 1/29/2013 20:30:31

1) A physical device with poor design is the keyless engine start/stop functionality in my mother's car. No key needs to be inserted to start and stop the car. There is only one button, which functions as both the engine on and off button. This same button also allows just the car's power to be turned on by pressing lightly (it does not engage the ignition). The biggest problem, however, is that there is *no feedback* to know whether the car is: 1) on with only power, no ignition; 2) fully ignited; or 3) fully off. Each time I try to turn off the car, I am unsure whether I am stepping out of a car that is still on or if the car is indeed off. Other cars simply have a little light on the same button which is illuminated when the car engine is on and turns off when the car engine is off! This is a much better design. It could also be solved with visual feedback on the dash. This is never a problem with the old turn-key ignitions, because you can see and feel where the key is in the ignition.

2) Physical devices can provide tactile affordances and are often unchanging, whereas affordances of software interfaces can be dynamic and populate according to the tasks at hand, thus affording different tasks relative to the previous task, etc. Perceived affordance includes the idea that the user's conceptual model is taken into account when interacting with the object. This takes into account a user's familiarity with existing objects and/or interfaces, for example whether users are used to pressing buttons rather than swiping across a button, etc. Actual affordances are anything the object can lend itself to or be utilized for, without taking into account how a user is likely to interact with the given object.

Cong Chen - 1/29/2013 22:43:24

One example of a physical device that has a bad design is my current point-and-shoot camera. On it, there is a dial that can be rotated clockwise or counterclockwise. However, you can also press down on it in the up, right, down, and left directions like a normal arrow pad. In the conventional way of using point-and-shoot cameras, to change the settings for flash, display, etc., you're supposed to press the up, right, down, etc. arrow keys. My camera uses this function in addition to having the ability to turn the wheel in either direction. Though I have gotten used to this design, it is definitely not intuitive the first time you use it.

This bad design violates both the "make things visible" principle and the good conceptual model principle, as the dial does not look like it can be pressed and thus adds confusion as to how the user can change the flash and other settings. It does not agree with the conventional and logical way of operating a point-and-shoot camera, since there are no other buttons on the camera that allow you to change the settings. To solve this issue, I would simply redesign it by changing the dial to a directional pad of up, right, down, and left, as that would make visible the ability to press in a direction to change a setting. At the same time, it wouldn't affect a user's ability to navigate through the camera.

No, there is no difference between affordances of physical devices and software user interfaces. Some physical devices naturally imply how they should be used, like the holes in the handle of a pair of scissors. The same thing applies to software user interfaces like websites or mobile applications. For example, given a list of classes or things to do on a mobile phone, it is natural for a user to want to click on them individually. Likewise, when a user scrolls on a mobile phone, it is natural to touch and move in the opposite direction of where you want to scroll. These are only some examples of the many types of software user interface elements that have affordances.

When Norman mentions "perceived" and "actual" properties of affordance, he is referring to the actual abilities of something versus how people perceive what you can do with it. He gave the example of glass. Glass is clear, so its actual affordance is that you can see through it. People perceive this as a property of glass as well, but an additional affordance is breaking it, as this is something people can do with glass, and thus it is a perceived affordance. In the context of software user interfaces, an actual property of a menu would be the ability to click on it, while a perceived property would be the ability to navigate around the web application.

Zeeshan Javed - 1/29/2013 23:21:20

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

One example of an everyday thing that I have personally had frustrations with recently is the lid of one of my flu medicines. The lid affords being twisted like a screw cap, but instead requires you to read the bottle, which makes you realize that you actually have to squeeze it first and then twist. This medicine lid clearly violates Norman's idea of the affordances that objects lend to humans. The lid further violates the principle of visibility: because it is round and has no areas for the hand to grasp and squeeze, it misleads the customer. In many ways the mapping and feedback are lacking as well. As the customer twists the lid without squeezing, they are left confused over why it isn't doing what it's supposed to do. Customers may be misled into pulling or applying pressure to the wrong parts of the lid. Instead, the customer must read the box and find that the lid must be squeezed uncomfortably with a lot of pressure, slightly twisted, and then pulled. This design worsens the customer's condition, intensifying the headache they were trying to alleviate in the first place.

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

Software user interfaces are under the same constraints as far as affordance goes when considering design. The reason I would say there are no differences is that the consumer is still a human in both instances, and thus follows the same psychology of ease of use and design mechanics. For example, Notepad on Windows has the affordance of being an interface for writing notes, not watching movies. The desktop relays the idea of a space to put important files that require attention, or to have tools laid out for easy access, not a compartment that is difficult to access on your computer. When Norman talks about perceived affordance, he refers to the fundamental properties that determine just how the thing could possibly be used, like the notepad or desktop; more precisely, it is how a human perceives and thinks the object will be used. Physical affordance refers to built-in properties, such as a computer with a keyboard, mouse, and screen: the computer affords pointing with the mouse, watching the cursor, and clicking on desired tasks and objects.


Matthew Chan - 1/29/2013 23:47:22

1) A sink with a circular knob that doesn't really rotate. The sink turns on by pushing the knob upward (slightly strange, but it still makes sense). Changing temperature requires pushing the knob left or right, awkwardly rotating it just a touch at the same time. A circular knob seems to afford turning more so than pushing, so making it purely rotational seems more sensible. It could turn in only one direction, with an arrow to represent this, and the water would go from cold to hot (since it takes time to heat the water anyway).

2) Touch is a very important sense. Although software can emulate physical affordances through appearance (e.g. buttons afford pushing), people still interact with it indirectly through hardware like a mouse. Even touchscreens lack the complete feedback of a physical button, making them just slightly less intuitive.

Erika Delk - 1/30/2013 0:16:55

1. This might sound weird, but I always thought that makeup containers were horribly designed. The worst is foundation bottles (foundation is the skin-colored goop you put on your face to even out your skin tone). Foundation almost always comes in tall, skinny glass bottles, even though you usually put foundation on with your fingers (or maybe a sponge). To get any makeup out you have to shake the bottle upside down, and I always end up getting it all over the place. Even worse, when you're running low you have to prop the bottle upside down for a couple of minutes to get any out. I always wondered why those containers weren't designed like ketchup bottles, which are stored with the opening downwards and which you can squeeze. Another bad thing is eye shadow. I don't even wear eye shadow and I don't like the design. Most eye shadows come with a color palette and a foam brush to apply it. The brushes always get lost, and if your palette has more than one color you have to wash out your brush when you want to switch colors. Eye shadow should be like lipstick, where the color is applied directly. These are like the glass doors in Norman's book; they were designed to look nice and not for optimal functionality.

2. While physical devices can generally afford changes to the physical world (e.g., you push a broom across a floor and clear the area of dirt), software does not (you can move something to your trash bin, but that change only exists on your computer). When Norman mentions "perceived" affordances he is referring to what we think something can do; this is different from an actual affordance, which is what something actually does. In terms of physical devices, the perceived and actual affordances are apt to be fairly similar. We all live in the physical world and have a lifetime of observing cause-and-effect relationships, and as a result we can predict fairly well what something is going to be able to do. Furthermore, it is easy to check what a physical device has done; we just take a look. With software, however, the effects may be much less clear. The effects of an action are dependent upon the decisions of some designer somewhere, as opposed to physics.

Aarthi Ravi - 1/30/2013 0:25:53

My room heater is a good example of bad design. The heater has features like a temperature-control knob, an eco-smart option, a timer, and an oscillation feature, along with many more. I had to set the timer, which indicates how long the heater should stay on. The default was set to 1 hour and I couldn't seem to change it. Much later, I realized that the temperature-control knob served a dual role of setting both the temperature and the timer. This clearly violated Norman's principle of visibility. The controls also lacked a natural mapping, as the timer control was placed far from the knob. This made it less intuitive for the user. As a designer I would place the temperature controls and timer close to the knob to make it more intuitive for the user. I would also explicitly indicate the use of the knob with labels like deg F / hour, etc. Another design issue was with the oscillation feature. The main unit had to be attached to a base support for this feature to function. The first time I used this feature, the main unit toppled. Again, the instruction to lock the unit to the base was given on the bottom of the main unit, so the visibility principle was violated. This could have been prevented if feedback had been sent to the display indicating whether the unit was locked to the base or not.

The affordances of physical devices are very similar to the affordances of software user interfaces. Controls on physical devices and UI controls on software user interfaces play the same role of assisting users to operate the device/ software. In the case of software user interfaces, Norman refers to properties of UI controls as the "actual" affordances and how these UI controls are used as the "perceived" affordances. For example, if we consider a Paint Application, the Brush tool is a UI control with an "actual" property to color diagrams and how the size of the brush is chosen is a "perceived" property.


Ben Goldberg - 1/30/2013 0:33:39

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

I can think of many, but the first one that comes to mind is my dad's refrigerator. It has a dial inside the fridge to set the temperature. The numbers on the dial range from 1 to 9. It does not indicate which way to spin the dial to make it warmer or colder; it is completely ambiguous. You might think that 1 is the coldest setting because the lower a temperature, the colder it is. But is it the other way around? It could also make perfect sense if the larger numbers were the colder settings. Adding to the frustration is the fact that you can't just test the setting the same way you can a light switch; there is no instant feedback. It's just awful any way you look at it!

Like I said, the problem with this design is that it is ambiguous. There are many fixes for this problem; I think the simplest and most effective one would be to label the dial with the range of possible temperatures. That way, if the fridge were set to 38 degrees and you wanted to make it colder, it would be very intuitive to turn the knob so the number would be lower than 38.

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

The affordances are different for software UI but they exist. If you see a button, you push it. A task bar gets dragged. Anyone who has used a computer has learned these affordances.

Perceived affordances are what you expect to happen when you perform an action on an object. Actual affordances are what actually happen when you perform an action on an object. If you have a form with a "submit" button, the perceived affordance is that it will submit the data to a database. Ideally you would like the actual affordance to match this perceived affordance.

Zach Burggraf - 1/30/2013 0:34:54

1) I'd like to take this opportunity to thoroughly complain about the thermostat outside of my room in the house I grew up in. One major principle of design is that the mapping between which sequence or combination of buttons performs which operation should be intuitive. In my case, I legitimately never could figure out how to set and run the various programs the thermostat allowed for, and I would have much preferred a thermostat that simply turns on the heater until the room matches the temperature you set. Another problem was that the feedback (or visual interface) was unclear as to what the readout on the screen meant. Does 80 degrees mean my room is 80 degrees, or that you are heating my room up to 80 degrees? Why not add the extra couple of LEDs to distinguish "current temperature" from "desired temperature"? Also, just to cover the principle of visibility, I regularly use a Samsung monitor which doesn't have real buttons; you just kind of hold your finger up to the glass over the power symbol and it turns on. The problem is that the power symbol and other controls are printed on dark glass in a dark colour, so no one can ever figure out how to turn it on, even in a well-lit room. You could make the symbols glow or reflect light at least, but even then I'm not so sure it's intuitive to just touch the glass over the symbol.

2) Obviously there are some differences between physical and software as far as mechanics go but I think the same principle still applies just as well. Common interface components have affordances (i.e. buttons are for clicking, they don't do anything else. Sliders are for... sliding. And check boxes are for checking.) For example the perceived use of a slider is that it can be re-positioned somewhere in between two extremes, so many interfaces use them to allow for a user input of something that allows two extreme values with intermediate values. (i.e. speaker volume can be off, max or somewhere in between).

Winston Hsu - 1/30/2013 0:38:40

1.) The handle on the door of my apartment building has a horrible design. To go outside you have to pull the door open, but the latch has a lever which you must push and hold to release. The result is that you must use two hands to open the door, one pushing the lever while the other pulls the door. This violates several of Norman's principles regarding natural signals. The latch has a large flat area as well as the word "PUSH" as clear signals that it needs to be pushed. However, directly above it is a fixed handle that signals it needs to be pulled. These conflicting signals cause confusion when operating the door.

2.) The biggest difference in affordances between physical devices and software is that while users generally have an intuitive understanding of physics and how physical objects behave, they generally do not understand what happens inside a computer. Perceived affordances refer to things that allow the user to understand functionality just by looking at the object. Actual affordances are how the object really operates. In good designs, the perceived affordances match the way the object actually functions.

Yong Hoon Lee - 1/30/2013 0:59:28

When I read the first question, regarding a physical device with poor design, I immediately thought of a particular watch. Norman discusses watches in the reading as an illustration of the paradox of technology, as watches became more and more complicated as they added features. While I have noticed this trend, the particular watch I am referring to has a somewhat different problem. Last year, I attended an Oakland A's baseball game, in which they gave out A's branded sports watches (made by Deuce: http://www.deucebrand.com/). My fandom for the team in question made me excited to start using the watch, but when I actually sat down to figure it out, I found that it would not be practical for me to use. First of all, to set the time, one must take off the watch and look on the back for two small metal buttons (IMG_2459_White.png). These buttons are not labeled, nor can they be pressed using one's fingers. Nowadays, because of the proliferation of digital watches, pressing buttons to change the time has become more intuitive and better mapped, in that one can generally figure out how to set a given digital watch with, say, four corner buttons. However, this is a completely new system, and in order to set the time, I had to read the instructions, which were confusing and included unnecessary taps of the buttons to confirm the time. Furthermore, the feedback was poorly mapped, as the watch was "set" when the display was flashing, not when it was solid. While I understand the logic behind this, namely that the flashing occurs every second, it was confusing to me. Finally, while it was an annoyance that the buttons had to be pressed using a pen, in general one does not need to reset the time on a watch very often, so I did not consider this a large flaw. However, after a week or so of wearing this watch, I found that it actually ran fast, in that the time displayed at the end of the week was about 15 minutes later than the actual time. This defect, combined with the difficulty of resetting the time, made me abandon the product. Ultimately, this watch exhibits almost every failure of design that Norman identifies. Most prominent is that the two buttons do not have any affordances, other than that they should be pressed. However, they are not labeled, evincing a lack of visibility, and do not give enough feedback to know if one is actually setting the time, viewing the date, or just viewing the time, a paradox of technology as well as a lack of feedback. The mapping is also incorrect, as the buttons are not for setting the hour and minute, as one might expect; rather, one sets the mode or confirms, and the other scrolls through the times, violating the mental model one may have when presented with a watch with two buttons. Finally, the watch is incorrectly engineered and difficult to fix, which, while not necessarily part of Norman's principles, can easily be seen as poor design.

In many ways, the concept of affordances of physical objects is very similar to affordances of software user interfaces, in that there are many elements of a user interface which afford specific actions. For instance, just as wood affords writing upon, text boxes (more generally, long rectangles which may have greyed-out text within them) afford user input of data. Furthermore, elements such as check boxes (squares) and radio buttons (circles) afford different features, namely selection of multiple options for the former and selection of a single option for the latter. One difference is that for user interface elements, their affordances can often come from repeated usage of those elements (such as with check boxes and radio buttons) as opposed to the actual physical constraints which are often found in physical objects. This difference, I believe, is exactly what Norman means when he mentions perceived and actual affordances. The affordances of user interface elements can be thought of as perceived affordances, as the objects acquire their affordances through a form of training. For instance, a user sees that every time check boxes appear, he or she can select multiple choices; at this point, the affordance of check boxes has been established. However, with physical objects, often their chemical composition or physical attributes limit their affordances, as with glass. Transparency is a physical feature of glass that makes it easy to look through, and its brittle quality is something that was not simply attributed to it, but is rather something that it innately possesses. However, there are obviously physical objects with perceived affordances (for instance, expensive items of clothing can sometimes afford durability, regardless of whether the item is actually durable), and likewise, there are certain user interface elements that have actual affordances, though usually by evoking a physical object. One such example is the roller elements which are commonly used to input dates or times in calendar or alarm clock mobile applications. These rollers evoke physical wheels, and hence afford features such as a small number of options and the ability to loop back around. These affordances are perceived in some sense, but by creating the illusion that these elements are actually on a physical wheel, the affordances become "actual" in a more tangible sense. User interface elements which attempt to mimic physical objects can have "actual" affordances which can be stronger and can aid in creating a stronger mapping between the element and its functionalities. This can also go the other way, however, as an element may evoke a physical object but may not afford the features of that object, causing confusion. One such example is an eBook reader interface which does not support page turns by swiping. Users will note that the application looks like a book, and will naturally assume it affords "turning a page," but the user interface does not necessarily need to follow these assumptions, potentially causing problems.
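
The check-box/radio-button distinction above maps directly onto widget behavior. A minimal, hypothetical Android sketch (class and label names invented for illustration): the RadioGroup enforces a single choice while the check boxes toggle independently, so the learned affordance of each shape matches what the control actually does.

 import android.app.Activity;
 import android.os.Bundle;
 import android.widget.CheckBox;
 import android.widget.LinearLayout;
 import android.widget.RadioButton;
 import android.widget.RadioGroup;
 
 public class SelectionAffordanceActivity extends Activity {
     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
         LinearLayout root = new LinearLayout(this);
         root.setOrientation(LinearLayout.VERTICAL);
 
         // Radio buttons: the group guarantees at most one checked option,
         // matching the learned "select exactly one" affordance of circles.
         RadioGroup size = new RadioGroup(this);
         for (String label : new String[] {"Small", "Medium", "Large"}) {
             RadioButton option = new RadioButton(this);
             option.setText(label);
             size.addView(option);
         }
         root.addView(size);
 
         // Check boxes: each square toggles on its own,
         // matching the learned "pick any combination" affordance.
         for (String label : new String[] {"Email updates", "SMS updates"}) {
             CheckBox option = new CheckBox(this);
             option.setText(label);
             root.addView(option);
         }
 
         setContentView(root);
     }
 }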

Raymond Lin - 1/30/2013 1:07:42

At my apartment there is a desk that only lets you unlock its side compartments when you release the main tray under the desk. To anybody who doesn't know about this trick, it's extremely difficult (to the point of breaking the desk) to open any other compartment besides the main tray. However, upon learning the trick, it seems almost tedious that you should have to open the main tray over and over again to access the storage areas. Norman's principle of visibility is clearly violated here, as a first-time user could not have seen this little trick coming. Rather than having the main tray serve as the unlocking mechanism, I would probably just put separate locks on each of the compartments so they could be independent of one another.

I don't think that affordances of physical devices and software user interfaces are different. For one, when we see an interface like Facebook's homepage, it affords a gathering of friends, and this is reflected in its News Feed, constantly updating you on the lives of your friends, as well as its messenger chat bar, telling you which users are currently available for chat. I think what he means is that the perception of something, whether it's a physical device or an interface, should translate to its actual utility. So going back to the example of the chair, we perceive that we should be able to rest on it, and that is translated into its four-legged design to ensure its ability to support our weight.

Tiffany Jianto - 1/30/2013 1:21:42

1) One example of a physical device with a bad design would be an elevator I got trapped in. While my family and I were out at dinner a few years ago, my younger cousin and I excused ourselves to use the bathroom which was a floor below. On our way back, she thought it would be fun to use the elevator to go back up, and I went along with it. We pressed the button to go up, and the doors opened promptly. However, after we got in and pressed the button to go up, the elevator didn’t move. We tried a variety of buttons and none worked; we even tried the button to open the doors back up, but nothing happened. Luckily, I had my phone on me so I called my brother to press the button for the elevator from the outside so the doors could open back up again. After we got out, we found out that the elevator was designed for residents of the building the restaurant was in, and you needed to swipe a card to go up. I’m sure that whoever designed this elevator had great intentions in mind for the residents, but nothing informed us about this or gave us feedback on what was going on. Some elevators are wonderfully designed; however, this particular one failed to let anyone know how to use it-- even if I had been a resident, I would not have known to swipe my card because there was no hint to do so. This elevator violates the good design principle of visibility and feedback. There was no visibility about instructions to swipe a card and no feedback as to what we should be doing or if any buttons we were hitting were actually registering. If I were to re-design this, I would make sure to include some notice outside, and not even allow the elevator to open without swiping the residential card since once you’re inside the elevator, you can’t do anything without it anyway (not even open the doors!). Furthermore, I would give some feedback, so if a user tried to press buttons without swiping their card, they would somehow know to swipe their card, whether it was through a recorded voice or some kind of message displayed!

2) Yes, there are differences in affordances of physical devices versus affordances of software user interfaces. Physical devices, because of their limited physical controls and uses, may be easier to use, especially since you can physically touch and move the object around to get a feel for it; also, as he describes, scissors have affordances such as holes which clue the user in to how to use them, and even if the user puts the wrong fingers in the holes, the scissors will still work. For software user interfaces, affordances are a little different because there aren't any physically intuitive ways of looking at software to discover how it works; one cannot touch, test, or move something around to figure it out. Therefore, software user interfaces must have a good design to afford their purpose to the user. In this way, perceived affordance is very important for software user interfaces, as the user must be able to perceive the affordance of a user interface, because there is no physical or actual affordance that can be determined. Actual affordances are the physical properties, including the shapes, materials, etc., while the perceived affordances are more suggestions of how something may be used. It is much more intuitive to see the actual affordances of physical devices and the perceived affordances of software user interfaces.

Lishan Zhang - 1/30/2013 1:37:31

1) The knobs on a stove are an example of bad design that I have had to use. Usually there are four burners on a kitchen range: two in the front and two in the back. However, the knobs are always arranged in a line, which makes it hard for people to identify which knob controls which burner without reading the labels or instructions. In this case, the affordances fail. The design of the knobs violates Norman's design principle of having a good mapping and a visible structure. I think we should re-arrange the knobs based on the relative locations of the burners they map to, rather than putting them in a line.

2) Affordances of physical devices usually have both perceived and actual properties, but software user interfaces only have actual properties because we cannot know the material of software user interfaces by just looking at them. In my opinion, "perceived" affordances refer to the properties of the raw material of the device, and "actual" affordances are the properties of the device once the raw materials are assembled into some exact form. For example, the "perceived" affordance of a window is glass that can easily be broken, but the "actual" affordance is a window that can be opened and closed to let in fresh air from outside.

Shujing Zhang - 1/30/2013 1:52:55

1) One of the devices I use every day and consider very poorly designed is my alarm clock radio. The first principle it violates is that it is not built on a good conceptual model. There are almost 25 buttons; however, it still takes many steps to set the alarm. One has to go through many steps and even hold a button throughout the process to activate the alarm-set mode. Also, the alarm-set mode resembles the time-set mode very closely, so I usually have to be very careful not to confuse the two functions. The difficult and inconsiderate design is very tiring for me, especially when I am very sleepy.

It also violates the principles of visibility and mapping. Every time I try to enter the alarm clock setup mode, the only clue I get is how the interface on the screen looks, which is very similar to the interface for setting the time. There is no obvious indication telling me that the clock is currently in alarm-set mode. Also, the buttons on the clock don't describe what they do; I had to conduct trials to figure them out the first time I used them.

If I were to re-design the alarm clock radio, I would use a bigger screen and separate the buttons for setting the alarm from those for tuning the radio channels. In every mode, the screen would display which mode you are currently in. I would simplify the setup process and use more visual aids. I would also have symbols or names describing what each button does; if a button has two functions, I would list both.

2) The affordance of a physical device helps users know how to do a task based on the design features of the object. For software, affordances help the user understand how a function works and think about the application in the way the designer intends.

Perceived affordance is how users understand things to work from their point of view. Perceived affordance is very important in interaction design for software. Actual affordance is an affordance that actually supports the user in doing something; usually we refer to physical objects for actual affordances.


Christina Hang - 1/30/2013 2:00:38

One device that I've had trouble with is the garage door remote control. This device should be fairly simple, especially when you have only one garage door, but the remote control has three buttons with either no labels at all or just one dot on the first button, two dots on the second, and so on. Although most remote controls are designed for homes with multiple garage doors, it should be clear that for a single door the first button should be used, but this is not always the case. Also, for multiple doors, one side is usually bigger and should be associated with the bigger button, but this is also not the case. I've had a couple of instances where I had to try out the different buttons before I actually hit the correct one. Of course you can always program the remote to your liking, but the initial settings should clearly identify which button opens which door. Another issue with garage doors is that when you push the button once the door opens, and when you push the button again the door closes. However, if you pushed the button while the door was opening or closing, the door stops. Now, when you push the button again, should the door continue what it was previously doing or go the other direction? I've seen garage doors where it depends on how long afterward you push the button again: push it quickly and it will continue on its path, but wait too long and it will go the other way. The remote control violates Norman's principle of mapping because you don't know which button controls which door until you test them. There are also no labels that give the buttons any meaning; some buttons may not be for opening the garage door at all, and pushing one might do something you couldn't see and would never know about. One way to fix this problem would be to include labels for which button opens which door, like "left" or "right," and to print instructions on the back of the remote for how multiple pushes behave.

I don't think there are differences between affordances of physical devices and affordances of software user interfaces, because both have to adhere to the limitations on the uses of the object. For instance, in software, a slider can only slide and not be pushed like a button, and a piece of wood can only block light from passing through and cannot be looked through. Perceived affordances are the typical uses of an object, compared to the actual affordance, which is how the object is really being used. Take a bread clip for example: the perceived affordance is holding closed the end of a bread bag, but the actual affordance could be holding the end of a roll of tape for easy access.

Brian L. Chang - 1/30/2013 2:18:39

One example of a device with a bad design is the Comcast TV/cable controller. Did you know that you could turn on subtitles? Well, you can, but you have to press three buttons at the same time to get to a secret menu to select captions. Then you have to turn your cable box off and on before the subtitles appear. (This is for cable boxes that came out before last year.) This controller violates several of Norman's design principles. It is not intuitive and it does not draw on intuition from other areas of our lives. The subtitle feature is not visible and is pretty hard to find unless you have your computer in front of you and are able to Google the solution. Lastly, there is no visible outcome of the operation; you don't see the results until the next time you turn on your cable box. This feature could be simply remedied with a subtitles button on the controller or an option shown after pressing the setup button.

Well, in general physical devices can be touched and therefore usually have a different set of possible affordances. For example, in the article the chair affords support and sitting, but a user interface by itself cannot afford support and sitting without some sort of physical device. The difference between "perceived" and "actual" affordances is that "perceived" affordances are what the user perceives as possible actions on the object, while the "actual" affordances are the actual possible actions the user can take. The difference is perception vs. reality.

Elise McCallum - 1/30/2013 2:48:15

1) One poorly designed everyday physical device I have encountered is my car's side mirror adjustment knob. A single knob is designated to control both the left and right side mirrors, rotating them towards and away from the car. I have never successfully been able to move them. In its design, one must first press the lever of the knob towards the correct side and then move it either "up" towards the front of the car or "down" towards the back of the car. However, it is nearly impossible to ensure that the lever is both on the correct side and moving in the correct direction (as it is unclear whether moving the lever up will move the mirror towards or away from the car). This violates the design principle of mapping, as there is no clear connection between which way the lever is moving and the way the side mirror is moving. If redesigning the system, I would first split the knob so there are two levers, one on the left and one on the right. This presents a clear mapping, as the leftmost lever is for the left mirror and the rightmost for the right mirror. The lever, instead of moving up and down, would then move from left to right (indicating either outside (left for the left mirror, right for the right mirror), away from the car, or inside (right for the left mirror, left for the right mirror), toward the car) so that the mapping is clearer. If one pushes a lever away, they are thus pushing the mirror away. If one pulls the lever towards them (to the inside), they are consequently drawing the mirror in closer to the car. I think this presents a clearer mapping that can thus solve the problem and ensure safer driving.

2) Yes, there are differences in affordances of physical devices versus affordances of software user interfaces. In the physical domain, the perceived affordance is dictated by what the device actually looks like and the nature of how one interacts with it. Can it be held, where are the grooves, what is its size, what is the raw material, etc., are all considerations that provide perceived affordances. Actual affordances are the ways in which the device can actually be used (i.e. a chair affords sitting). The difference between physical and software devices is that software interfaces must rely largely on pictorial representations of the actual affordance. For example, the recycling bin icon actually affords deleting files, and its perceived affordance is that it is a bin and thus meant to have things tossed into it. Essentially, a software interface relies on symbolic representations of physical devices in order to utilize affordances, whereas physical devices and the way they are constructed determine their individual affordances.

Zhaochen "JJ" Liu - 1/30/2013 3:44:16

  • Question 1

When I go to a classroom, there are usually a lot of lights, controlled by a set of switches (usually around 4-8 switches). It is very hard to know which switch controls which lights. I always have to try multiple times in order to figure it out and finally turn on the lights I want.

It violates Norman’s design principles of mapping. The switch and the light should work well together but they obviously confuse a lot of people.

There are 3 ways that I can redesign this: 1. Label each light switch with a name, such as 'front', 'mid', 'back', 'whiteboard'. 2. Lay the switches out in a diagram that maps to the actual layout of the lights. 3. Have some buttons on the side that are able to turn the lights to some pre-defined mode. For example, a button may be called 'lecture'. This button will turn on the lights that point to the teacher and the whiteboard but only turn on some of the lights that point to the students. Another button called 'video' will turn off the lights that point to the whiteboard but leave some of the lights that point to the students on.


  • Question 2

In computer software, the user sees objects through the screen, so it is hard to sense the perceived affordance because you cannot feel the material. On the other hand, a physical object can provide people with more perceived affordances because, as Norman said, different materials give people a certain psychological sense. He illustrated this point using examples such as glass, wood, and knobs. A physical object contains both perceived affordances and actual affordances.

Perceived affordances are what people feel are possible actions. For example, smooth and flat surfaces are good for writing on. Actual affordances are what can really be done to the object. For example, a fixed knob gives people the perceived affordance of turning; however, it cannot actually be rotated.


Alvin Yuan - 1/30/2013 4:34:25

My apartment has this heater control that I do not understand. On: 2013-01-30 02.25.34.jpg Off: 2013-01-30 02.24.18.jpg It's a knob, and to turn it on you turn the knob clockwise. The only reason I know how to use it is because it's a fairly simple device in terms of the number of actions it can do (off, on/warm, on/hot) and I can hear the heater turn on. It suffers mainly from providing a poor conceptual model of how it works. The device provides three cues: a nick in the knob, a circle of numbers on the knob, and a bump near the knob with "OFF" next to it. Unfortunately, because of the way the cues were set up, they provide almost no information. The circle of numbers rotates with the nick, so the nick is always in the same place relative to the circle of numbers. The nick can never actually point to the "OFF" bump either. The lightning bolt symbol conveys no intuitive meaning; I still don't know what it means. Together this creates a poor conceptual model, demonstrated by the difficulty in answering the following: if I made the nick point opposite to the bump, which way would you turn the knob to turn the heater off? None of the cues would help you out, indicating that it really provides no coherent model of how it works. Many better designs already exist, but here's what I would do: keep the nick or change the knob to be pointed; change the indicators to "Off", "warm", "warmer", "hot", etc., indicating a temperature gradient from off to hottest; move the indicators off the knob so that the nick's position relative to the indicators can change; and make the knob click or snap into each setting to provide some feedback to the user. This way, the nick combined with the snap clearly conveys which setting the heater is in, and the settings themselves are clearer.

I think affordances of physical devices and software UI are actually very similar and have many parallels, which is partly why software UI often is made to mimic a real physical device. One difference I would say exists though is that physical devices can have actual affordances widely different from their perceived affordances, in that throughout time we have made numerous accidental discoveries and advances by finding new and unintended uses for certain items. UI's actual affordances are typically closer to their perceived affordances as they are rarely made with the flexibility that allows for accidental discoveries of new uses. In the context of software UI, I think perceived affordances would be what the user believes he/she is capable of doing within the UI, whereas actual affordances determine what the user actually is capable of doing. For example, seeing a text editor, a user may believe that the editor has copy/paste capabilities but in reality that is not a supported feature. Another example would be a user thinking that there's an input in the form of a continuous slider but in reality the slider snaps to discrete values.
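
That last slider example can be sketched directly; the snippet below is a hypothetical illustration (names invented), not anything from the reading. The SeekBar looks continuous, but on release it snaps to a few discrete values, so the actual affordance is narrower than the perceived one.

 import android.app.Activity;
 import android.os.Bundle;
 import android.widget.SeekBar;
 
 public class SnappingSliderActivity extends Activity {
     private static final int STEP = 25;  // allowed values: 0, 25, 50, 75, 100
 
     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
 
         SeekBar slider = new SeekBar(this);
         slider.setMax(100);
 
         slider.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
             @Override
             public void onProgressChanged(SeekBar bar, int progress, boolean fromUser) { }
 
             @Override public void onStartTrackingTouch(SeekBar bar) { }
 
             @Override
             public void onStopTrackingTouch(SeekBar bar) {
                 // The track looks continuous, but on release the value
                 // snaps to the nearest multiple of STEP.
                 int snapped = Math.round(bar.getProgress() / (float) STEP) * STEP;
                 bar.setProgress(snapped);
             }
         });
 
         setContentView(slider);
     }
 }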

Kevin Liang - 1/30/2013 4:43:39

1) I use a microwave almost everyday. If affordance counts as a concept, then it totally violates it. Microwaves were built to heat up food to desired temperatures. How do people use a microwave? They enter digits that signify how long to heat the food up for. A user should never have to calculate how many seconds to heat it up for. Instead, the microwave should automatically do it. If I were to re-design a microwave, I would just have a start and stop button. It should stop automatically when the microwave thinks it is hot enough.

2) I would say there are no differences in affordances of physical devices and software UIs. Both physical devices and software have user interfaces, and they all serve a purpose; the difference is just that one is electronic and one is physical. A perceived affordance is how an object might be used. Sometimes you may not get the desired effect, in which case there is a UI flaw. What actually happens is the actual affordance. A good UI design should have the same perceived and actual affordances.

Brian Wong - 1/30/2013 5:00:13

I have this waffle maker back at home that seems simple at first glance. It simply heats up after plugging in so you can pour your batter inside. On the outside is a single light. The problem here is lack of visibility into knowing what that one light symbolized, for it actually had to represent many things. It needed to show when the waffle maker had power, when the waffle maker was hot enough to pour batter, when the waffles were in a state of cooking, and when they were done. There was no visibility for any of these four items because the light simply had two states, on and off, and it was difficult to deduce which state of light represented which state of waffle making. The easiest way to resolve these issues is to have a physical switch for On/Off, and then have an electronic readout that could display the state ('Heating', 'Ready', 'Cooking' (with time optional), and 'Done').

Physical devices have affordances that we often think of due to their material and their 3-dimensional shape. A desk, for example, is flat and large, which affords space for placing objects on and writing/computing on. If the desk were made of cardboard, however, it would afford the property of not being sturdy, and of being quite malleable instead. Software affordances are a little bit different because they are represented by pixels and are two-dimensional. However, this is often overcome by the use of symbols throughout software (for saving, printing, exiting, etc.) that have affordances associated with them due to their purpose in the real world. Most software can also emulate 3-dimensional space for objects such as a button (a representation of a real object) to simulate the affordance of being pressed. In this context, 'perceived' is often what one would assume a particular device affords due to its natural properties (such as material, shape, or symbol), while actual affordance is what it can really provide, be it similar to or different from the perceived affordance. An example is that wood might afford the exact same amount of physical protection as glass against the same force, yet they have different perceived affordances (based on their typical uses), and thus get handled differently.

Tananun Songdechakraiwut - 1/30/2013 5:34:37

1. When I was on vacation in Seattle, I went to a music museum, and inside there were long, weird-looking chairs in front of a big projector screen. In particular, the surface had a symmetric wave shape on which people could sit. Where were we supposed to sit, then? Some sat on the wave peaks and some in the troughs. Many people wouldn't even know they were chairs unless the big projector screen was nearby, and would think they were some artistic figure. Thus, those chairs violated the design principle of a good conceptual model. I would re-design them so that the chair surfaces had some sign indicating where to sit, or use hard materials for the positions that are not meant for sitting.

2. Yes, there are. When one deals with physical, real objects, there can be both perceived and actual affordances, and they don't need to be the same. However, in software user interfaces, designers only have control over the perceived affordances. Note that even though the monitor, mouse, and keyboard afford looking, pointing, clicking, and touching on every pixel of the display screen, those affordances mostly have no meaning. Perceived affordance is what people perceive the thing can afford, but it doesn't need to be the same as the actual affordance (what the thing in fact can afford).

Sangyoon Park - 1/30/2013 6:20:26

1) I have a 27-inch LCD monitor attached to my computer. This monitor has several control buttons (power on/off, menu, source, and 3 more buttons for special functionality) which are basically touch buttons, so they work when I simply touch the button area with my finger. This monitor is such a bad design. First, every time I want to turn the monitor on or off, it is very easy to touch a button that I didn't want to press, since the touch buttons are too close to each other and too sensitive. Second, I hardly use the three special function buttons because they require installing software from the CD which came with the monitor, and I don't even remember where that is. This design violates one of Norman's principles, "The principle of feedback." The buttons are touch based and do not give me proper (tactile) feedback, so they confuse me every time I use them. I would make the buttons normal buttons to give users natural feedback - the sense of touching. As for the second issue with the CD, I think it is related to the paradox of technology: having more and more functions in a device makes it harder for users to use.

2) I believe there is a difference between affordances of physical devices and affordances of software user interfaces. I would say it lies in perceived affordances, since users will have different expectations of an element of software than of a physical device if that element does not represent exactly the same thing as a real-life device. For the second question, "perceived" affordances are how users think about an object in their mental model in order to figure out how to use it, and "actual" affordances are what the object physically affords.

Tenzin Nyima - 1/30/2013 9:10:44

The main entrance gate of the house compound where I live (which is a small wooden early-1900s door) is a good example of a bad design that I have to use every day. The problem with the design of the door is that one has to guess whether to push or pull the door in order to open it. There actually is a sign that says "pull," but being 100-something years old, it is kind of hard for people to read (and not to mention, as Norman says, "When simple things need pictures, labels, or instructions, the design failed"). The design of the door definitely violates the principle of affordance: it is not obvious for people to decide immediately whether to "pull" or "push" the door simply by looking at it. This does not happen to me anymore since I use the door every day, but it does happen to my guests who seldom visit the house. As Norman pointed out in the reading, I would fix the door (gate) by attaching some sort of hardware that signals "pull" when people enter the house and hardware on the other side that makes it obvious for people to "push" the door when leaving the house.

I don't see many differences in affordances of physical devices versus affordances of software user interfaces. For example, as I said in the paragraph above (where I talked about the affordance of a physical device - the door), a very similar situation could happen in software UIs. Sometimes we come across software that is hard to operate (even for simple things), and other software is great - making our lives much easier without requiring much effort reading long manuals to do simple things. Just as the hardware on the door signals people whether to push or pull, buttons and warning signs in a software UI signal users to click or not to click. By "perceived" affordance, Norman means the perceived properties of the thing - for example, how to use the door. And by "actual" affordance, Norman means the actual properties of the thing - the door being solid and firm.

Achal Dave - 1/30/2013 10:02:43

1. There are about 5 or 6 controls on my chair, which isn't much, but after setting it up, I spent some 15 minutes with an instruction manual (I did not know they'd started providing these with chairs) after having wasted another half hour or so beforehand, because who reads the manual? To this day, I'm still unsure of what 4 of those controls do. There's a knob which is supposed to control how tight one particular motion of the chair is, but I have no idea which motion--is it something to do with the "lumbar support" control, or is it to control how far back I can lean in my chair? I've tried messing with it multiple times and still haven't figured it out. The instruction manual is not clear. Also, the chair rolls away as soon as I try to sit on it. I've resorted to placing a large piece of cardboard underneath it just to keep it steady.

There is no mapping between the controls and the physical thing that they do, except in one case: raising and lowering the chair. The knob is circular and faces up, but it does not have anything to do with the chair's rotation.

There is no immediate feedback from some of the controls. The only way to figure out what they do is to keep trying them (or read the instruction manual, which still is not much help) and hope you notice a difference.

If I were to redesign it, I'd try to label the controls on the chair itself, and place them at more logical places. Ideally, the chair could simply have an electronic method for doing things via a remote, though that would greatly increase the cost. Within that constraint, it makes sense to put controls near their action points, and to orient them in a way that maps to what they're actually doing, similar to the raise/lower seat control.

2) There are some differences, but not many. In software, our perceived affordances come to some extent from skeuomorphism and learned behavior. For example, we know that anything that is blue and underlined is likely a link, no matter where we see it (learned, but almost native now). On the other hand, buttons often have a drop shadow that indicates they are raised and can be pushed down (skeuomorphic). In this case, the perceived affordance comes from such design (the blue, underlined link) that indicates the actual affordance of the action (this is clickable).
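
As a small, hypothetical illustration of that learned affordance on the platform this course targets (class and string names are made up): a plain TextView is given the blue, underlined styling users have learned to read as "clickable," and a click listener supplies the actual affordance behind it.

 import android.app.Activity;
 import android.graphics.Color;
 import android.graphics.Paint;
 import android.os.Bundle;
 import android.view.View;
 import android.widget.TextView;
 import android.widget.Toast;
 
 public class LearnedLinkActivity extends Activity {
     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
 
         TextView link = new TextView(this);
         link.setText("Open the syllabus");
         // Blue plus underline: the learned visual convention that says "clickable."
         link.setTextColor(Color.BLUE);
         link.setPaintFlags(link.getPaintFlags() | Paint.UNDERLINE_TEXT_FLAG);
 
         // The actual affordance: a click handler that really does something.
         link.setOnClickListener(new View.OnClickListener() {
             @Override
             public void onClick(View v) {
                 Toast.makeText(LearnedLinkActivity.this,
                         "Link tapped", Toast.LENGTH_SHORT).show();
             }
         });
 
         setContentView(link);
     }
 }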

Avneesh Kohli - 1/30/2013 10:18:52

An everyday thing that I consider to have bad design (though maybe it's just me) is the lock on a doorknob. I personally think it has poor design because it lacks the principles of mapping and feedback. When I'm locking a door, I seem to always forget which way I need to turn the key to lock it. While this isn't so much a problem with the bottom lock (as I can simply turn the knob to see if the door opens), the top lock is more difficult. This design lacks mapping because the actions of locking vs. unlocking are not at all indicated by the lock in any way. Once you insert a key, you can do one of two things: turn it right or turn it left. But without experimentation, it's difficult to tell which one does what. Additionally, this design lacks feedback. Once you've made a decision to turn the key right or left, there isn't a way of immediately knowing if your action was successful. The only way to tell whether the upper lock was correctly locked is to unlock the bottom one and try to open the door. Because there isn't a knob like there is for the bottom lock, there needs to be a more direct way of indicating whether an action was successful. To remedy these problems, I'd suggest a small indicator light to show when the door is locked, and 2 icons with arrows on the top of the lock to indicate which way to turn the key to lock or unlock the door.


The biggest difference I see in affordances of physical devices versus affordances of software is that the number of things software interfaces can afford is significantly smaller than for physical devices. For example, a chair has a nearly limitless number of affordances (though only a restricted few logical ones), whereas a component of software can really only afford use in a limited number of ways. That said, the basic notion of affordances on physical devices vs. software UI is more or less the same. A 3D button in software lends itself to being pushed or clicked, just as a metal plate on a door suggests it be pushed by the user. Regarding perceived versus actual affordances in software, Norman means that perceived affordances are the properties and use cases that a component is likely to suggest, whereas actual affordances are all the possible properties and use cases for a particular component. In some cases these can be the same, such as a single button on a modal dialog box; it only suggests one thing, and you can only do one thing with it.

Mukul Murthy - 1/30/2013 10:48:23

1) The lighting controls in the Embedded Systems lab (204 Cory) are poorly designed because the mapping is terrible. The lab is roughly a rectangle, with strips of fluorescent lighting that span the width of the lab. There are enough lighting strips to cover most of the length of the lab, and when all lights are on, the room is well lit. The problem is that the 6-10 switches that control all these lights aren't mapped well to the lights. Some of the switches control a region of the room. Others do things like turn on every alternate light. Others turn on certain lights at lower brightnesses. It takes several tries to get the combination of lights you are looking for, and even that is not always possible.

The lighting controls have good visibility; it is clear where all the switches and lights are. The feedback is also good; one can instantly tell what change was made by the last switch flipped. The only change I would make would be to the mapping: I would make the leftmost switch correspond to the farthest two strips of lights in the room (the left end of the room while facing the switches), the next switch correspond to the next two strips, and so on. I would make the last switch a "master" switch that turns on all lights, and make this clear by either labeling it or replacing it with a larger switch so anyone could tell it had more importance than the others. And if less-than-full brightness is required, I would add a dimmer to control the levels, instead of arbitrary switches that dim certain lights.

2) I think people are more familiar with affordances of physical devices than with affordances of software UIs, because people tend to have less experience with software UIs. There are still some affordances in software interfaces, however. On any toolbar with standard options like File, Edit, and Tools, almost everyone would expect options like New, Open, and Save under File and options like Copy and Paste under Edit, simply because we have learned to expect these things of any software interface. But these are all learned affordances; no one has to sit on a chair to realize that it looks like something that could be sat upon.

Perceived affordances are what people think are the properties of the device, and actual affordances are the real properties of the device. To use an example from the reading, despite being an actual affordance of most phone systems, the Hold button is not a perceived affordance of modern telephones because people don't know how to use it or where to find it. Software interfaces tend to have simple perceived affordances, but many hidden properties that people don't figure out until much later. For example, Microsoft Windows has several hotkeys and functions used for navigating windows effectively, but most people know only a few of these or navigate solely with the mouse.

Matthew Chang - 1/30/2013 10:55:28

1) I have a water boiler at home which operates on a very simple principle: when it is plugged in, it is either boiling water or keeping it warm. But the process of pouring out the water is somewhat dangerous. The water does not flow out of the nozzle smoothly, and the markings on the screw-in cap only indicate a direction to turn it to allow the water to pour out. There are no markings for how far to turn the cap to allow the water to pour out. This leads to the possibility of burning oneself with hot water.

2) There aren't many differences in the affordances of physical and software user interfaces. Software interfaces make heavy use of visual cues that reference what the user knows from real life, such as the materials used in the structures in Angry Birds and the paper-like background of many note-taking applications. Perceived affordance is something like the utility of an object derived from non-interactive observation; the best example is buttons, which may be labelled with only some of their actual functionality. Actual affordance tends toward purpose-directed features with defined utility that reflects what the object actually does.

Sumer Joshi - 1/30/2013 11:11:21

1. Some cars have bad design due to a lack of labeling on buttons. A/C is a universally understood button, but I sometimes confuse the defrost button and the heat button. In my Accord, both buttons have the same coloring, but if you look closely, they do two different things. This violates visibility because you are not sure which one you are pressing unless you memorize it. Also, if I want to shift the music between the front of the car and the back of the car, I need to pull a knob out and tinker with it in order to get more "fade." I did not know I needed to do that until I started playing around with the knob to get more bass or treble. This is also a visibility issue. For the music, I would have one knob for just treble and bass, and one knob for fade.


2. Norman indicates that actual affordances are discovered by actually trying or testing something out, while perceived affordances are what we think the object affords. He gives the example of the affordances of materials: because vandals see glass, they are more likely to break it. I don't think there are any differences in affordances between physical devices and software UIs, because both have perceived affordances that the creator of the UI or device envisions and hopes will match the actual situation of the user.

Claire Tuna - 1/30/2013 11:13:14

1) The problem: This is the floorplan of part of my apartment: screenshot20130130at102.png There is only one way to enter the room with the light switches (the bottom left doorway in the drawing), and from that doorway, there is a close switch and a far switch. The close switch turns on the light in the current room. The far switch turns on the light in the room on the right. There are a number of things wrong with the design. Imagine that you are standing in the doorway dividing the two rooms. One switch is to your left, and one switch is to your right. How do you know which one lights the room in front of you and which one lights the room behind you? There is not a natural mapping between the controls and their function. Another issue is that the light switches are so far apart. More often than not, I want to turn both on or turn both off, and their distance makes this a pain. Not to mention, the convention is to have light switches in a row. Finally, the far light switch is out of the way. It is not on the natural walking path (indicated by the arrow). The only reason anyone ever visits that wall of the room is to reach the light switch.

The solution: There is no function to the room on the left; it is effectively a doorway to the right room, which contains a table that I eat and study at. In my opinion, it doesn't need an independent light. The only reason its light gets used is to supplement the light in the room on the right, which is too dim by itself. I would get rid of the far light switch altogether and link the two lights, so that the close switch turns them both on/off.


2) I think the metaphor of affordances translates pretty neatly from physical devices to software interfaces, especially with touch screens. Buttons afford pressing. Drawers afford pulling. However, there aren't as many rich options with software interfaces as there are with hardware devices. For example, Norman mentioned all of the things you can do with glass and with wood; there aren't so many things that you can do with a button. When Norman mentioned "perceived" vs. "actual," he meant that users carry with them baggage about certain interfaces and materials. His example was wood = writing surface, glass = breaking surface, even though both were equally breakable. A perceived affordance is subjective to the user and his/her associations. In the realm of software, a perceived affordance would be "buttons are for tapping, sliders are for sliding." We have these perceptions because of past experiences with buttons and sliders. The actual affordance would be the range of actions the slider or button truly allows. In good design, the perceived affordance equals the actual affordance.

Joyce Liu - 1/30/2013 11:18:23

1) My roommate has an electronic scale that is perhaps one of the sleekest scales I have ever laid eyes on. It's pretty fancy and has quite a few more features than your average scale - I think it can keep track of your weight over time. The reason I say "I think" is because I can't figure out how to use any of the features other than basic weighing. The surface of the scale is made of glass, and you turn it on by tapping the surface. After the screen on the scale sets itself to zero, you can step on top of the scale and weigh yourself. The principles that this device violates include the following: First, visibility is an issue. Although there are multiple functions that can be performed with the scale, these functions are invisible to the user. There is nothing to remind the user of the various functions the scale is capable of. There are more functions than controls, which makes mapping between controls and functions difficult. Assuming that the user can activate the different features by tapping at different places at different frequencies, I would redesign this scale by placing lettering or pictures representative of the functions on the glass at the trigger areas for those features. These words or graphics would act as a reminder of all of the features while not being completely intrusive to the sleek design. I would also make the screen on the scale bigger: right now it's about the size of a business card, but given that it tracks weight over time, a bigger screen would make it easier for the user to visualize his/her weight over time.

2) When Norman mentions "perceived" versus "actual" affordances, "perceived affordance" refers to what the user perceives or thinks the product can do, whereas "actual affordance" refers to what the product is capable of doing in reality. There are differences in affordances of physical devices versus affordances of software user interfaces. Although both physical devices and software user interfaces have affordances that come naturally and affordances that require a bit of learning, software user interfaces have more affordances that require learning, since software is not necessarily part of everyone's daily life. My mother seldom asks me how to use physical devices, but she often comes to me to ask about her iPad. Additionally, affordances come from our experiences interacting with objects in the world, and we come to understand our experiences through embodiment. Since we are directly interacting with the physical world, perhaps we are able to perceive more affordances with physical devices (when they are designed well). Another point to consider is that software user interfaces often mimic the real world, so in a sense the affordances are not drastically different. Instead of pushing a button, it now becomes a click. Instead of flipping a page in a book, it now becomes a swipe. Thus, to maximize perceived affordances in software user interfaces, the designer should look into what the user already knows or does in daily life so that the user can map the same affordances onto the interface.


Edward Shi - 1/30/2013 11:27:44

I find that current microwaves are surprisingly complicated. Before, it was simple: all you needed to do was set a time and press start. Now pressing 1, 2, or 3 could suddenly jump to some quick-cook method, with various functions for defrosting or roasting, none of which is clear. Most microwaves also now come with power levels, and sometimes it isn't clear which power level is being used; if I don't specifically select a power level, I don't know which level is used by default. I believe that the main problem with the microwave is visibility. Most of the time, the power level is not indicated on the digital screen when the microwave is cooking, so you do not know what level it is at. Microwaves are designed to be compact, and many times they feature just the numbers, but the numbers don't indicate that they double as quick-cook or defrost shortcuts. I also boldly assume that the designers did not bother to collect much feedback for the microwave, as my friends also struggle with using it. Due to the lack of visibility, very few of us are even aware of the various special functions in the microwave. If I were to re-design the microwave, I would mainly focus on visibility. Instead of trying to make as few buttons as possible, I would make more buttons that explicitly state their jobs. I feel that the number buttons should not have any shortcuts attached to them, as numbers in our mental model serve specifically as numbers for a timer or for a power level. If there had to be quick-cook or defrost functions, there would be separate buttons for them. Perhaps there could be a special-function button area with combinations for specific functions, and a sticker on the side that lists all the specific functions. I would also ensure, if there are power levels, that the digital display shows what power level the microwave is at, and whether or not the microwave is in some special mode.

The affordances of physical devices tend to convey specific physical meaning. For instance, buttons are normally for pushing, knobs are for turning, levers are pulled, etc. However, software affordances are more functional. For instance, the red button on browsers is now associated with closing. I feel that software affordances also paint this mental model because they may not necessarily be intuitive but learned and habitual. I feel that in physical devices, perceived and actual affordances are normally consistent; for instance, few people would use glass for something that is going to take a lot of force. However, in software, perceived affordance is not natural but learned. It is only once we learn that the red button normally means closing a window that we have the perceived affordance. So actual affordance is more of a design choice as opposed to an inherent property of some material.

Bryan Pine - 1/30/2013 11:29:49

1) In the Foothill Academic Services Center, they have a real problem with the light switches. First of all, there are roughly 10 of them, and none of them are labeled with what they control. There is one by the door, which seems to turn on about half the main lights (at least that makes sense), and then 2 on one side that control most of the rest of the main lights. In the "back" (other side), there is a panel with 7 switches that don't really seem to do much when you move them. I think some of them turn on outside lights, and some turn on lights in other rooms, but one is particularly bad: it turns on some soft lights behind a desk on the other side of the room (too far to actually see). This violates the concept of a natural mapping; one would think that the switch to turn on those behind-the-desk lights would be near the desk, or at least with the other switches located on that side of the room. In fact, the person in charge of the center told me the lights once remained on all spring break because no one could find the switch. This switch (and the panel in general) also violate the principle of feedback, that you should be able to see the results of your actions. At least with most light switches you can tell which lights they control; not with these.

If I were able to redesign this system, I would first of all label the light switches (that would probably be the simplest fix to make them at least usable). If I had more time, I would rearrange the switches to put them close to the lights they affect, and at least on the same side of the room. I also wouldn't have 7 on one panel, because that is just overwhelming. Maybe even some color coding between lights and switches to create a natural association?

2) I think the concept of affordances is fundamentally the same for physical devices and for software; different materials and layouts are "for" things in both the physical and virtual space. For example, consider Norman's example of how wood paneling "affords" writing rather than smashing. The same concept applies in software: if you want the user to enter text on something, make it look like a piece of paper, not a button (those afford clicking). One difference with software is that software interface affordances are usually merely perceived rather than actual distinctions. In the physical world, wood really is easier to write on than glass, and it would be extremely difficult to write on a sphere (that is more of an actual affordance). With software, there really isn't any reason that you CAN'T write on a button, other than the perceived idea that something that looks like a button isn't supposed to be used for that. Of course, that doesn't mean perceived affordances aren't important in the physical world, or that actual ones are irrelevant to software design. Vandals could still smash the wood but choose not to (perceived affordance), and it would be very hard to enter black text on an all-black background (actual affordance of a black button). I just mean that perceived affordances may be more common and influential in software design than in the physical world.

Moshe Leon - 1/30/2013 11:56:15

1. An example from my own home of an instrument with poor design is my electric teapot. I love instant coffee, and the little time it takes to make. Put water in the teapot, push the button down, and after a couple of minutes, behold: the water boils. Magic? No. Good design. The teapot resembles a teapot, and it has only one button that needs to be pushed down to boil: simple and straightforward. Once the water boils, it stops working. The problem began when I had to buy a new teapot; the old one just decided it had made too many cups of coffee and tea. I looked for the simple design I used to own, but alas, I could not find one. What I did find was a state-of-the-art mechanical instrument that had the capabilities to measure heat, boil water to many temperatures, and announce once it is ready and the water has boiled. It was fairly cheap, not more than what an electric teapot of the old, simple design used to cost. Once I unpacked it, I filled it with water and pushed the button, which read "boil" on its label. I waited, and after a few minutes nothing seemed to happen. After messing around with it and pushing various buttons, eventually it started to react, and the water boiled. This hellish scenario repeated itself over and over again for a few weeks, before I finally understood that I actually had to press my choice of temperature, and then the boil button, in that order. Today, I have no problem with the device, and I actually like all the neat features it provides, but it could really have been less sophisticated, for all I care. I almost gave up on it and turned to the regular, stove-top teapot. The "Norman design" principles violated were visibility, the paradox of technology, and proper feedback. According to Norman, in order to achieve proper visibility, "the correct parts must be visible, and they must convey the correct message." The new mechanical teapot was lacking in instructions. How would I know which button to push first, or last? According to Norman, "Manuals can solve many visibility and other issues"; however, the manufacturers seem to have reasoned that their new mechanical teapot was simple enough to be understood by the average user, and they did not provide a manual. The paradox of technology came into play here, since many new features can be added, and were added, to this mechanical teapot; however, I was doing just fine without them, and their presence only added to my frustration with the teapot. If there had been proper feedback while I was trying to boil the water, I would at least have understood when I was not successful during my many tries.


2. First of all, it is important to explain what affordance is, according to Norman. Norman claims that “affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used.” Affordance is what our mind associates with a material or an object, and the “list” of all possible usages that can be made or done through or with that object/material. Norman’s example of wood vs. glass was perfect: when vandals saw barriers made out of glass, they smashed the glass, since this is one thing that one can do to glass. When the barriers were made of wood, the vandals drew graffiti on them, because one thing that one can do with wood surfaces is to draw upon them. Norman calls it the psychology of materials. It is all understood better through the conceptual model, which helps us understand what things are. It is a strict methodology of simplicity that must be adopted, and the creation should aspire to an appearance that resembles a tool of similar usage, to prevent confusion. It is almost like Platonic forms: a tree stands for all trees in our mind, and when we see a tree we categorize it as a tree if it fits the form we have of a tree. Our mind tries to categorize things, and that helps us use or understand things that we have never seen before, just by seeing something similar. So when we categorize something as a tree, we will treat it as a tree. The same goes for mechanical and non-mechanical devices. Once we see a resemblance to another tool that we know, we know how to use it just by categorizing it as what we think it is. In a similar manner, affordance comes into play when certain objects or materials are associated through their possible usages, which we, humans, know in our minds and categorize so well. Perceived affordances are all the things one can think of and associate with the material the device/object is made out of, and the actual affordances are what the material in its current form was meant to be used for. According to Norman, “When affordances are taken advantage of, the user knows what to do just by looking: no picture, label, or instruction is required.” The only difference between affordances of physical devices and affordances of software user interfaces that I can think of is illustrated by the example which Norman provides. Once instruments of a mechanical nature are used and a bad result is produced, it is human nature to associate the failure with the person using the device, and not with a faulty design. A bad action is more perceivable than a bad program. While physical devices have their appearance and the material they are manufactured from to associate with the affordance phenomenon, software user interfaces may achieve similar results, but in a different way. Software produces feedback, and very often the feedback is not immediate, so an action can be repeated, out of frustration, numerous times, only for the delay to be discovered too late. By then, the action has been repeated numerous times, a fact that can be disastrous in some instances. Software interfaces lack the natural materials that real items possess; however, they are still composed of 1’s and 0’s, or the programming languages that help their creator make them a real thing. If I were to extend Norman’s idea of potential vandalism, then all software is prone to hacking and misuse through backdoors and faulty design.
This may help to explain the widespread phenomenon of hackers, or people who produce malicious software to hurt other people’s programs and software. The perceived affordances of software are limitless, while a real object might be restrained to a limited number of usages.


Alysha Jivani - 1/30/2013 11:58:10

(1) My 3-hole-punch is a little difficult to use because it is often hard to tell whether the paper is placed properly (so sometimes the holes are lopsided or closer to the edge of the paper). I think this is due in part to a violation of the principles “make things visible” and “feedback” (although you get feedback once you remove the paper, it would be nice to know beforehand). Currently, it is hard to see the edge where the paper is supposed to line up because the parts of the hole-puncher are so close together and it’s dark. Spacing them apart would make it easier to see if the paper has been placed correctly. In terms of both feedback and visibility, it might be helpful to redesign the hole-puncher with a thin strip of light (or even some sort of brightly colored strip of metal) that the sheet of paper has to completely cover. That way, if the paper is placed correctly, the strip is covered, thus providing visual feedback.

(2) I think there are differences in affordances of physical devices versus those of software user interfaces. For example, physical devices can often take advantage of material and actions in three dimensions, and oftentimes it is natural to think about how the object is meant to be used. With regards to software, while multi-touch is becoming more and more popular, you tend to lose that material affordance and some of the 3D motion (though accelerometers and such are bringing the 3D aspect of motion and use back in). I think software user interfaces rely more on "perceived" affordance (i.e. it is not a physical property or mapping, like turning a physical knob). For example, a very common UI screen that we see quite often is "slide to unlock". This has become "natural" to us in that it is a perceived affordance. When we see it, we know how to use it even though a horizontal sliding motion like that could not be applied correctly in other situations (i.e. it is not an "actual" affordance of the physical object).
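
As a rough sketch of how purely perceived this affordance is in software, a stock Android SeekBar can be made to behave like a slide-to-unlock control; the class name and the "unlock" behaviour below are hypothetical, assumed only for illustration and not taken from the reading:

 import android.app.Activity;
 import android.os.Bundle;
 import android.widget.SeekBar;
 import android.widget.Toast;
 // Hypothetical example: a plain SeekBar standing in for a "slide to unlock" control.
 public class SlideToUnlockActivity extends Activity {
     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
         final SeekBar slider = new SeekBar(this);
         setContentView(slider);
         slider.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
             @Override
             public void onProgressChanged(SeekBar bar, int progress, boolean fromUser) { }
             @Override
             public void onStartTrackingTouch(SeekBar bar) { }
             @Override
             public void onStopTrackingTouch(SeekBar bar) {
                 if (bar.getProgress() >= bar.getMax()) {
                     // Dragged all the way across: treat it as "unlocked".
                     Toast.makeText(SlideToUnlockActivity.this, "Unlocked", Toast.LENGTH_SHORT).show();
                 } else {
                     // Released early: snap back, as the familiar lock screen does.
                     bar.setProgress(0);
                 }
             }
         });
     }
 }

Nothing in the widget itself knows anything about "unlocking"; the affordance lives entirely in what the user has learned to expect from the sliding thumb.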

David Seeto - 1/30/2013 12:04:01

1) An example of an "everyday thing" with bad design that I use is my desk chair. Although it affords support for me to sit on, its two levers and one knob are neither visible nor map naturally to what they actually do. Take the lever to adjust the seat height, for example. First, the two levers are not marked or positioned in any particular way, so distinguishing them becomes an early problem. Once found, there is only one way to pull, the difference being that adding weight to the chair makes it go down and vice versa. However, a first-time user looking for the lever would not actually be sitting down, as they'll need to get off to find the lever itself. This unnatural mapping of one lever to two functions is an easy problem to run into. In addition, the knob attached at the base of the seat turns clockwise and counter-clockwise, but only at a very slow rate, since the knob is difficult to turn. To this day, I do not know what it does. Part of the reason for that is the lack of feedback when turning the knob. Another reason is that the knob, and the levers in fact, follow an arbitrary mapping. There is not readily a mental model from which we can infer what they are for. I would redesign the chair to be more like a car seat, where levers are positioned where you think angles should be adjusted. A lever is under the seat when you want to pull the chair forward. The lever to recline the seat is to the side, angled in the way a car seat naturally is.

2) There are both similarities and differences in the affordances of physical devices and software user interfaces. Take, for example, the desktop icons and layout seen in lecture one. The recycling bin banks on the idea that it looks similar to a trash can in order to imply its use; there is crumpled paper, a recycling symbol, etc. Also consider a button widget in Android development. When pressed, shadows form around the button so as to give it a pressed look. In this way, it affords being pressed. However, this leads to the differences, the difference between "perceived" versus "actual" affordances. User interface software tends to have perceived affordances. Buttons are made to look like they can be pressed, volume control scrolls up and down with a knob, analog clock widgets can be placed on the desktop, etc. The difference is that physical affordances are implied by physical attributes. Take, for example, his description of paper: "flat, porous, smooth surfaces are for writing on." The software button is an example of a perceived affordance.
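
A minimal sketch (assumed for illustration, not from the reading) of how such a pressed look can be wired up in Android code: a StateListDrawable swaps the button's background while it is pressed, so the widget visually "gives" the way a physical button does:

 import android.app.Activity;
 import android.graphics.Color;
 import android.graphics.drawable.ColorDrawable;
 import android.graphics.drawable.StateListDrawable;
 import android.os.Bundle;
 import android.widget.Button;
 // Hypothetical example: give a Button a darker background while it is pressed,
 // so the perceived affordance ("this can be pushed") is reinforced visually.
 public class PressedLookActivity extends Activity {
     @Override
     protected void onCreate(Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
         Button button = new Button(this);
         button.setText("Press me");
         StateListDrawable background = new StateListDrawable();
         // While the pressed state is active, show a darker fill.
         background.addState(new int[]{android.R.attr.state_pressed},
                 new ColorDrawable(Color.DKGRAY));
         // Default (not pressed): a lighter fill.
         background.addState(new int[]{},
                 new ColorDrawable(Color.LTGRAY));
         button.setBackground(background);
         setContentView(button);
     }
 }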

Linda Cai - 1/30/2013 12:05:26

The microwave in my apartment has a difficult and confusing design, with several buttons that were clearly designed to be used on a regular basis but are ignored for the most part by residents of the apartment. There are six buttons for ‘commonly cooked foods’: popcorn, potato, pizza, frozen vegetable, beverage, dinner plate. When you press one of them, a number appears on the display screen. However, if you press the button again, it will show a higher number. By pressing the button, you loop through some numbers, but nothing indicates that the number is the ‘power’ or the temperature at which you wish to cook the food. This violates Norman’s principle of visibility and does not provide a good conceptual model. Several functions are mapped to one button, but it is not intuitive that the button has several options, and it is ambiguous what the numbers actually mean; it could be the amount of time you wish to cook or the temperature, neither of which is obvious, since the number is usually in the range of 1-20. The button affords setting a temperature, but the perceived affordance is vague for new users who did not read the manual. Another button on the microwave, ‘power’, seems like it would adjust the temperature for normal use (simply setting a timer to cook the food), but any combination tried with the number pad gives no feedback, making it extremely difficult to adjust the microwave to ‘high’ as many packages require. To fix the problem, I would remove the extra function from the preset foods, so that pressing the button multiple times will do nothing. In other words, I would map the foods to only one function (a timer that is suitable for the food selected). To change the temperature, one could use a button called “Temperature”; after the food is selected, pressing it would loop through “low”, “medium”, and “high”, and instead of a number for the temperature, the system would use the temperature mode and the food selected to determine the correct temperature needed for the food.

Physical devices have physical and spatial elements that give more opportunities for the design to have a natural mapping. They also have different types of constraints. For example, for floppy disks there is only one possible way to insert them into the computer due to the physical constraints set up by the design, and for scissors similar constraints make it natural for users to learn to use the finger holes. Affordances of physical devices can be very clear through good use of meaningful mappings. However, for software user interfaces, most of the time the affordances are vague and vary from software to software, and mappings between perceived actions and actual actions can be very confusing and arbitrary. Analogies are often made between software and physical objects (e.g. the office desk model) to make the software more natural to use, but due to the limitations of software, arbitrary sequences often have to be memorized to complete a certain task: right clicking to get to menus, swiping left or right to get to different icon shelves on smartphones, pressing either a square button or -/+ to adjust a window's size. In this context, perceived affordances may be what the user thinks a certain action will do in the software, and actual affordances are what the software actually does when the user performs the action. For example, on iOS, a user may press an application icon and hold too long rather than letting go quickly, and instead of entering the application, the icon shelf wiggles and deletable applications will show an 'x', indicating that you press it if you wish to delete the application. Some users may think the 'x' does something to return to the previous state and click on it, only to find that their application has been deleted. The software has many arbitrary or confusing affordances that do not constrain what the user can do, so the actual affordances can be unintended by the user. For many simple physical devices, the feedback is natural and unambiguous, allowing the device's affordance to be clear.

Thomas Yun - 1/30/2013 12:29:50

I find some overhead projector screens to be fairly hard to use. By use, I mean locking the screen down in place or retracting it back up. I think it isn't designed that well because of the difficulty of using it. I often see people struggling either to lock it in the down position or to retract it. Sometimes, I see people spend as much as 5 minutes trying to get the desired result. Although it has a simple up or down motion to pull it down or to let it go up, it still becomes a pain to use. I suppose in this case, there are 4 functions for two motions/actions. There is the pull down, release up, lock, and release lock. I'm not completely sure whether this violates the mapping or the feedback principle. It possibly violates the mapping principle because there is no clear motion for how to lock the screen in the down position, but it can also possibly violate the feedback principle because there is no clear indication whether it successfully locked or released. (Some people pull down and release only to find it going all the way back up.) One possible way to fix this problem is to have the bottom of the screen hook onto something. That would be a fairly simple solution.

I think whether there are any differences depends on the type of software, but I would say there aren't. In software, if we see a button, we know that we can click it and it'll do something. Or for any of the word processing software, we know that if we click on a specific function, it will perform that function. Basically, we expect a button or option to do something, and when we click it, it does just that. Perceived is the expected function, and actual is what happens or what we can do with it. For example, the spell checker is for checking spelling, so we click it for spell checking.

Derek Lau - 1/30/2013 12:37:53

One example of a physical device with bad design is the microwave. Many microwaves operate in different ways, with no universal standard. On some microwaves, you can enter the time you want it to cook for. Others you must press "Time cook" or some variant of that, which gets confusing because there is also "Time" and "Timer" on certain microwaves. Yet another questionable design decision is the inclusion of a "Minute add" button, which allows for quick addition of minutes in succession when pressed, but can also suggest to the user that this is the only way to add time to cook. This design violates the concept of providing a good conceptual model. The functions are visible, even if there may be an overwhelming and unnecessary amount of them, so visibility is not violated. Because there are so many ways to access one cooking function, the conceptual model looks like a large branching tree, with all the possible methods of cooking being the branches and leading to one result: the turning on of the microwave and the cooking of the food. I would re-design the microwave to be much simpler, with only one way to cook: enter in the time and press start, leaving out Minute Plus, Time Cook, Defrost, and Quick Settings (like for popcorn). This offers a straightforward conceptual model, because the main function for which people use microwaves is to cook food using a time. Its auxiliary functions, serving as a timer (without cooking food) and setting the clock, can have separate buttons. This allows the conceptual model to be simple, and its functions to be visible, where its only "invisible" function is the main and obvious function.

The difference in affordances of physical devices versus affordances of software user interfaces is that the means of interacting with affordances of software user interfaces are limited. With software, the user has only a mouse, a keyboard, and, with the rise of mobile technology, their finger with which to interact. Additionally, software has no physical properties and cannot be observed in three dimensions; rather, it is displayable only on a 2D electronic interface. By contrast, with physical devices, users can use their entire body to interact with the device. The difference between "perceived" versus "actual" affordances is that the actual affordances are the properties of the device that can be observed with the human senses, as opposed to the perceived affordances, which are properties the user assumes the device has when constructing a mental model of the device and simulating its use.

Alex Javad - 1/30/2013 12:43:05

1) I went to use the restroom the other day in Stanley. There were only 5 people who wished to use the urinal... but ONLY ONE person could use the urinal... because there was only one urinal. Why was there only one urinal? There was only space for one because a giant wall protrudes from out of one of the walls... partitioning the restroom into 2 sections... for no apparent reason. There's just a wall there. It is very cramped in that room and does not allow for easy mobility. If I were designing this restroom... I would take out that wall, move the sink, and put in at least one more urinal... thereby increasing efficiency of restroom usage as well as increasing mobility to get in and out of the restroom.

2) We can draw many analogs between physical devices and software user interfaces. A physical button affords pushing... just as a software interface for a button affords "pushing" which we know to either click or tap with our modern day devices. But of course there are differences in the affordances of physical devices versus affordances of software devices! Physical devices might afford things that software user interfaces cannot afford by nature of the fact they are not physical. I want to see somebody write software that will hammer a nail for you. Hammer affords hammering. Software affords other things that software has the capability to afford. Moving on... when Norman talks about "perceived" versus "actual" affordances... he is trying to communicate that a certain object may be perceived to have a certain intended usage... when in actuality it might have a different intended usage or purpose. If it is not clear what a device affords, then Norman argues that the device was designed poorly and does not offer any "appropriate clues" as to what the device affords. So in this context, that might mean complicated software which you think affords one thing, but in actuality does something else. Or physically, a door which you think ought to be pushed open, but in actuality you must pull it to open it.

Kayvan Najafzadeh - 1/30/2013 12:59:11

Remote Controls for Comcast TV :). I don't have any problem with the latest technology in electronic devices, and I can figure my way around very quickly, but a couple of days ago I tried to make a couple of changes to my brother's TV, and Comcast's remote control and the user interface of their devices are so confusing. You have to do a lot of work to put a channel in the favorites list or to switch to a favorite channel. I believe they have violated the principle of designing for understandability and usability, because I remember that in the past there were 4 arrow buttons on the remote: up and down were for switching channels, and left and right for increasing or decreasing the volume. Now these buttons are still on the remote, but there are also separate channel keys and volume keys! I would have made it so that these arrow keys work as before, but when in menus they act as a selector. In one sense there are some affordances in physical devices which are not in software UI. As it says in the chapter, people are already familiar with many things in the physical world, like knobs being for turning and balls for throwing, but in software UI people are very new to the interface and are not very familiar with it, even though after the couple of decades that computers have been around some things have become affordances as well, like "x" being a symbol for closing the application. On the other hand, in computer UI we have the power to make anything very easily; for example, we can change icons rapidly and come up with the most affordant one, but this is not very easy in the physical world. I believe what he meant is that the perceived and actual properties of the thing should be the same thing. For example, if there is a door knob that you should pull to open the door, this makes the perceived and actual properties two different things: one says to turn and the other says to pull.


Arvind Ramesh - 1/30/2013 13:02:23

One thing that immediately comes to mind is the godforsaken plastic packaging that most electronics come in. There is no easy way to open it, and usually no clues are provided on the packaging itself. I usually open these by taking a knife and awkwardly trying to pry open some section to get to what is inside. I have heard there are about 1000 hospitalizations each year from the opening of these packages, and I can see why. The manufacturers of electronics should put instructions on how to open these packages on the package itself. One way I would re-design this kind of packaging would be to have a label in big black letters that says "Cut along here with a knife". Some companies do this, but sadly they are the minority.

In the broadest sense, there really aren't many differences in the affordances of physical devices and software interfaces. The user should know what to do with minimal instruction and should not have a difficult time completing the tasks for which the physical device or software was intended. However, there are some extra steps that software interface developers have to take, in that a user using a piece of software will probably use it in conjunction with some other software as well. The software interface designer should keep in mind the OS commands the user is used to, as well as the functions of other software that might be used. When Norman talks about "perceived" and "actual" affordances, he is talking about how people have a mental image of how something should work, and if the actual device does not perform in this way, there will be problems. For example, if your new mail app cannot sync with your calendar app, and the old mail app could, you are not fulfilling the user's perceived affordances of the mail app.

Nadine Salter - 1/30/2013 13:02:50

== Laptop ports ==

Laptop ports seem to have universally bad design — both in layout, and in specific port design.

Most current laptops have an array of ports on either side, the organisation of which varies from model to model: if you want to connect a device, you either have to rotate the computer to inspect every side to find the port you need, or blindly shove the connector into any port that feels appropriately-sized. Failing that, you may need to get an adapter (e.g., Mini-DisplayPort-to-VGA connectors are required to connect a modern Apple notebook to projectors on the Berkeley campus), which adds its own set of complications...

Connectors, like floppy disks, don't have an inherently obvious "right side up". Some connectors avoid this problem by being reversible (MagSafe) and others have a clear asymmetry in the port design (HDMI), but there's one everpresent offender: USB. USB ports are symmetrical in shape, but have a recessed plastic obstacle — if you insert a connector the wrong way, it will only go in partially and you'll have to try again the other way. (Most USB devices have a logotype on the "top" side ... but this doesn't help on computers that have sideways USB ports, or on laptops where the USB ports are "upside down"!) Similarly, most computers have a headphone jack and a line-in jack, both of which take identical 3.5mm audio connectors; if you blindly insert your headphones into the wrong port and start playing music in a quiet space, you will be in for a surprise!

These offending connectors have problematic affordances. A USB port seems like it should be reversible, since it lets you start inserting a connector in either direction; a 3.5mm jack seems like it should intelligently handle either a microphone or headphones (and in fact, current MacBooks do merge both ports into one).

In a perfect world, every peripheral would support some sort of universal port with a clear right-side-up, and you could just insert connectors into any free port — or, better yet, a high-speed wireless connection, analogous to Bluetooth but orders of magnitude faster and drastically less difficult to configure. Realistically, indicators on the top of the laptop highlighting which connectors go where would allow users to quickly identify what ports are available, and physically separating similar input devices (say, audio in jack on the left port array and audio out jack on the right) would avoid awkward mistakes.

== Physical versus software affordances ==

Physical devices have physical affordances: you can, in real life, break glass or carve wood. Software user interfaces can emulate physical objects and take advantage of their affordances (skeuomorphism), though the emulation is imperfect — one can swipe a tablet screen to turn the page of an ebook, but one cannot dog-ear an ebook's page and find it again by feel. Perhaps Norman's "perceived" affordances refer to these additional bits of functionality that skeuomorphic software implies exist, but are not among the "actual" featureset.

Harry Zhu - 1/30/2013 13:15:34

1) One device with bad design that I have used was a radio alarm clock. Setting the alarm was really hard to figure out without the instruction manual because the button to set the alarm was not clearly visible, breaking the "make things visible" design principle. If I were to redesign it, I would have a designated, labeled button specifically for setting the alarm.

2) There aren't any differences, because like physical devices, software user interfaces have both perceived and actual properties. When Norman mentions "perceived" versus "actual" affordances, he means what you think something can afford to do rather than what it can actually afford to do. He uses the example of a chair, where the perceived and the actual affordance are both sitting.

Jin Ryu - 1/30/2013 13:18:59

My digital watch had a steeper learning curve for me in the beginning than a traditional watch with hands. It has more functions (alarm, stopwatch, calendar, different ways of showing the time), but it took a few days of figuring out what the buttons were for (mainly after consulting the manual a few times) and a week or so to get used to them. The watch has good feedback, since it shows the change exactly when you press a button, but the conceptual model is very abstract. Its buttons also violate the visibility and mapping principles. Each button can perform several functions, except for the single most obvious and easiest to use: the light button, which doesn't do anything else but brighten the screen in the dark. The other buttons (start, reset, and another one whose label has been erased due to wear but was possibly "mode") are also used to set the time/alarm and so on, despite their names mainly corresponding to the stopwatch function. In changing the display, "start" adds (never subtracts) to the time (and if you add enough, it can switch from army time to am/pm display), and "reset" changes the digit or the day of the week (again, if you press it enough times, and if you miss it, you have to loop through the entire cycle again). I still don't know how to turn the alarm off on my watch, or if it can even be turned off. I would re-design this watch to have at least 2-3 more obvious buttons for the multiple functions this watch provides, especially since there is an extra fifth button on my watch but I am not sure what it does. It seems to provide light as well, except there is already a light button. For a better interface, there would be four buttons on the right: up (+), down (-), on/off for alarm (start/stop for stopwatch), and reset (for stopwatch mainly). Four buttons on the left side of the watch may also help if they had one function each: stopwatch, alarm, time set, and light. The digital watch would also be easier to use if it had a better graphical interface (like being able to choose a certain function from a list as if on an iPod), especially if there is a constraint on the number of buttons it can have. In that case, a circular scroller around the watch may also be easier and more intuitive to use (such that moving the finger clockwise would increase the time) than regular buttons.

There are some differences in affordances of physical devices versus those of software. Individual physical devices, being more tangible and solid, are less likely to change over time, whereas software can be perceived as being more malleable. New conventions arise because of software, such as the X button being for closing a window, a clear box with a blinking line meaning a place to type, etc., so affordances in software can change with how common or standard the same property is in other software, whereas most physical affordances are restricted by natural laws. On the other hand, the virtuality of software can mean its affordances are more limited by the interface through which the user can interact than those of physical objects; for example, a certain program may only be used and accessed through its given interface (like mouse and keyboard), whereas a simple paper clip can have anything done to it that is physically possible, like being hooked into a chain, tossed around, or unfolded to make a wire, instead of just holding paper together like it was designed to do. The perceived affordances of physical objects may be wider and more general than those of software (which are usually listed by purpose). However, affordances of physical devices and software can definitely cross over with each other (a button-shaped image can carry the same affordances as a real, physical button, or vice versa, as with fake computer screens in displays or toys). What Norman means by perceived versus actual affordances is the accepted or stereotypical range of uses that the user thinks the application will mainly be used for (perceived), rather than the wider range of possibilities it CAN be used for, including creative misuse (actual).


Jian-Yang Liu - 1/30/2013 13:19:28

1) The washer violates two of Norman's design principles: feedback and consistency. We may simply consider the difference between my washer back home and the one that I use here in Berkeley. Even though the washer back home has plenty of options, I understand none of them except the several that I use frequently. They might be for different ways that clothes should be washed, but the options themselves don't provide an understanding as to why they are there. Further, one washer made by one company can even have different versions, leading to an inconsistency that requires me to look through and understand it all again. The design inconsistency isn't really a large problem, as the buttons are all clearly labelled. With regards to feedback, there should be a short explanation of what each button is there for.

2) There remains some difference in affordances of physical devices versus software user interface. With physical devices, the average person has already created a mental image as to what such a device has been used for, and thus whether this physical device has been translated to some other device (as in the glass used to panel shelters) or not, the "perceived" affordance remains. In this sense, Norman means that the material may be for some other affordance (as a wooden table), but we know that the material itself has been used for something else (wood, for painting on), and so there is still a "perceived" affordance to the material. When looking at software user interfaces, however, it is more the opposite. There is an actual affordance that the item is built for (Notepad for writing notes), and then there is the perceived affordance that evolves from the affordance (writing code, perhaps), which the interface can indeed do, but is not ideal for.

Oulun Zhao - 1/30/2013 13:26:09

1) A backpack that has both a zipper and a magnetic button. It violates the affordance principle among Norman's design principles because people tend to get confused about which one to use to close the backpack. I would make the button only decorative rather than actually magnetic, so that people know to use only the zipper and that the button is not for closing the backpack.

2) Yes, there are differences. For example, users have limited actions they can take on a piece of software, such as clicking and mousing over. However, with physical devices people can take a variety of actions, including pushing, pulling, bending, throwing, etc. Norman meant that people might perceive the affordances of a product differently than what the product was actually designed for.

Monica To - 1/30/2013 13:27:28

One example of a physical device, or as Norman would call it, an "everyday thing," is a device that some of my friends and I came across this past weekend when we drove a rental car up to Lake Tahoe. The rental car that we used for the 3 hour drive up was relatively new; it was larger and much more modern than any car we were used to or had ever been exposed to. It had many features, functions, cool ambient lights, and of course, a copious number of buttons. It is certainly a positive thing that car manufacturers are updating car features and keeping up with today's technology, but if five computer-science-educated people are unable to figure out how to adjust the temperature system of the car, there is definitely something wrong. The car's navigation panel in the front portion of the car had one large touch screen with multiple dashboards: options for temperature, entertainment, navigation maps, and other additional functions. And surrounding the touch screen were about four panels of buttons and knobs, all with symbols that we were all unfamiliar with. This of course made the car look very stylish and cool; the car's front lit up with multiple colored lights and a very cool-looking touch screen. Although the dashboards looked aesthetically appealing, we had no idea how to do anything with them. My friend, who was driving, had put a CD into the CD player earlier and struggled for about 10 minutes to find the eject button; he had to navigate through about 8 views from the home view of the touch screen to find it. Norman gave an example in the reading about the panels of eight glass doors and how his friend found himself stuck between them. The doors were undoubtedly very stylish, but they lacked intuitive sense in their design. The car's control dashboard in the front of the rental car was just like that: beautiful, but incredibly puzzling to use. The problem that I described with the car dashboard and ejecting the CD violates Norman's design principle of visibility. The eject button was hidden behind menus and menus on the touch screen display. We had to navigate through multiple options and menus to finally find the button. It wasn't visible immediately, and the path to get to the button was also not immediately visible.

There are definitely many differences between affordances of physical devices and affordances of software UI. Physical devices can have physical affordances, like the holes for fingers in a pair of scissors, while software UI has slightly different kinds of affordances. UI affordances are something that is eventually learned and taught to us and is perhaps not as naturally intuitive for us. UI affordances, like large buttons, swipe gestures to control things, text boxes, etc., were all things we were trained to use and that have now become "intuitive" for those exposed to technology. But physical affordances, like helmets, which we intuitively know to place on our heads, or gloves to place over our hands, seem to draw on a more innate sense of human intuition. Norman describes affordances as clues to the user about how to use the device correctly. He uses the example of glass and wood to describe perceived and actual affordances, and I thought that was a good explanation of it. Glass is a physical material that is naturally associated with shattering and fragility; this would be its actual affordance, and its perceived affordance would be smashing; so people like vandals would naturally want to smash the glass to make it shatter, while wood they would carve and paint on, because wood does not shatter into pieces.


Brett Johnson - 1/30/2013 13:33:27

1) One physical device that I have used that exhibits bad design is the shower head in my bathroom. It violated Norman’s second design principle-”make things visible.” While the conceptual model that the designers came up with is reasonable–turn the dial to switch between different settings–there is no indication of which setting the user is changing to. So, the user gets no visual feedback from the dial, and is left turning the device until they finally get to the setting they wanted. While the user can tell a difference in the pressure or stream, they must go through each one before reaching the desired setting. The fix for this design would be very simple. A small icon that indicated the pressure and stream of the setting could be placed at each click of the dial. After the user found a setting that they liked, in the event that someone else changes it, they need only to remember the icon that represents their preferred setting. As long as the icon unambiguously communicates its setting, this would likely be very easy for the user to remember.

2) Both physical devices and software user interfaces have affordances, some of which are the same and some of which are different. Buttons in both physical and software environments are seen as being used to control something or to effect some change in a system. Sometimes physical device affordances are more apparent than software ones, maybe due to texture and other real-life attributes of physical things. This could be why many interface designers resort to skeuomorphism in their design. I think that what Norman means by a "perceived" affordance is one that someone thinks a device has based on their conceptual model of the device. For example, if a music player has a dial on it, people will naturally think this affords volume control. It would only be an "actual" affordance if the dial actually controlled the volume. Usability issues arise when there is a discrepancy between the perceived and actual affordances.

Ryan Rho - 1/30/2013 13:35:19

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

The clipper cards for buses in South Korea and Singapore are inconvenient because you have to tap the card both when you enter and when you get off the bus in order to calculate how much the passenger has to pay. The card readers are attached at the entrance and exit of the bus. This is inconvenient because it delays the whole bus system, since all the passengers have to tap their cards, considering that more than 90% of passengers use these cards and each bus is almost always packed. In addition, it's costly to put card readers in every bus.

It would be more convenient if passengers tapped their cards at the bus stop. If each bus stop has a card reader, passengers don't have to take time to tap their cards at the entrance and exit of the bus, so the bus doesn't have to wait. This results in faster departure and arrival of buses.

Some people may worry that some passengers will ride the bus for free by not paying. However, if each bus stop has a line that you can only enter by paying at its entrance, you can prevent free riders, much like a line at an amusement park. As for tapping a card upon exit, we can automatically charge passengers the maximum fare if they don't tap the card when they exit.

As a result, not only will the bus system be faster and more accurate, but it will also save money and make the card readers easier to manage.

It is hard to tell which of Norman's design principles this device violates, because the method above resolves the overall delay by changing the design of the process rather than the design of a device. However, if I had to point out one design principle, the current clipper card system lacks mapping. Passengers pay at one bus stop and get off at another. When people think about taking a bus, they think about which bus stop to go to. In order to make the payment system more intuitive, putting a card reader at each bus stop makes sense.

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

There are differences between the affordances of physical devices and those of software user interfaces. The affordances of software user interfaces depend on the devices that display the software. For example, smartphones encourage touch-based affordances in their applications, while computers assume that users interact with physical keyboards and mice. Smartphone applications afford sliding up or down with a finger to scroll the contents, while computer applications afford clicking the up or down button of the scrollbar to scroll the contents. On the other hand, physical devices have their own affordances already, so they do not depend as heavily on other devices.

"Perceived" versus "actual" affordances are mentioned because the "perceived" affordances imply "actual" affordances, affecting the way the users interact. For example, a "perceived" affordance of glass is to break, so some people want to "actually" break it by understanding its "perceived" affordance.

Eric Leung - 1/30/2013 13:53:50

Badly designed printers, especially the large office ones, are among the most horrible things to use. With literally over 30 buttons, and a touchscreen with inputs of its own, there are way too many options to know what to press to do what. Like the washing machine in the reading, the inputs make it seem like a spaceship control room. This violates the principles of affordance and mapping because, for simple uses, a printer does not need so many inputs. Also, the feedback for wrong actions is horrible; if I were to accidentally make the wrong number of copies, it would immediately start spitting out stacks of paper, and I would have to figure out how to stop it. If I were to redesign a printer, I would do away with almost all the buttons and only use the touch screen and maybe 2 large buttons for copying and scanning.

There are differences in affordance between physical and software devices because of what we are used to seeing. If we see a knob, we turn it, but only because we grew up seeing doorknobs and turning them. On the other hand, if we see an X in the top right or top left corner of a computer window, we know it means to close it, because we have been around computers. However, for a first-time user of a computer, there's no such indicator to say that the _ means minimize, a square means maximize, and an X means close. So perceived affordances in terms of software could mean that we know what each button or scroll bar does because of what we have learned, while actual affordances could refer to the buttons on the keyboard, the mouse, or the monitor, where an action can easily be observed.

Juntao Mao - 1/30/2013 13:54:12

1. Light switches in lecture halls are examples of bad design. They violate Norman's design principle of having good visibility to indicate mapping. While each light switch corresponds to only one light, and feedback for actions is instant, most switches aren't labelled, so the user often has to try out many switches before turning on the right one. There is also no intuitive pattern to the order in which switches map to the lights they turn on. A simple re-design is to label the switches / provide a switch-to-light location mapping, or to design the light switches so that their positioning on the switch board corresponds to the light locations.

2. Some differences in affordances of physical devices versus affordances of software user interfaces are that:

 a) General public may have less pre-made assumptions about software interfaces than actual objects of daily life that they have interacted with for a long time.
 b) Software users have, with possibly just one screen, less to base affordances on than with physical devices. 
 c) Visibility of software interfaces is often lower than that of physical devices (i.e., when you pull on a door handle, it opens, vs. when you click a button online to open a *smart* door, a lot more actions are taken in the computer before something happens). Also, because the general public lacks the ability to understand the details of software, they often blame coincidences on software errors.

Perceived affordances are what the user thinks the object/software is intended to be used for, and actual affordances are what the designer meant it to be used for. Perceived affordances may often be wrong; however, a good design should have matching perceived and actual affordances.

John Sloan - 1/30/2013 13:59:45

1) I think that, in general, TV remotes have pretty terrible design, specifically the more recent remotes that try to accommodate every device you own. As an example, my uncle just bought a universal remote with a small touch screen that controls their Wii, TV, sound system, DVD player and more. The idea seems really cool, infusing the trendiness of touch screens with a traditional remote. However, it is trying to do too much and is simply impossible to use without hours of practice. Several times I hit the wrong button and the surround sound starts blasting music seemingly forever because I can't figure out how to turn it off. The remote is lacking visibility. There are at least 30 buttons, and the touch screen has layers and layers of different screens for different devices. It's pretty hard to find what you are looking for without blind experimentation. There is no mapping that might lead you to the right place. The remote does have a decent conceptual model, since the design makes clear which way it should be oriented in the hand, and it's easy to discover the point-and-click dynamic that all remotes share. In short, my re-design would make it much simpler and also lose the touch screen. The touch screen doesn't add much and is incredibly inconvenient because you have to look away from the TV in order to use it, and it is too easy to touch accidentally by laying on it. The paradox of technology.

2) There are differences in affordances between physical devices and software. The big one is that a lot of affordances are perceived from the material a device is made from. For example, the iPhone seems sleek and fancy because of its perfectly clear glass front. It is also natural to touch and write on because it is such a flat surface. But software is not made from material and does not have a 3D shape, so its affordances are perceived in different ways. For example, without doing anything, it was clear that this box was meant to be typed in. Norman mentions "perceived" vs. "actual" to make it clear that affordances give off a perception of how things are meant to be used, and that perception should hopefully match the actual use that was intended.

Shaun Benjamin - 1/30/2013 13:59:47

When I was growing up we had a tall floor lamp with no visible switch or knob to turn it on. In order to turn it on, you had to twist the top half of the support rod clockwise, which was not obvious at all. If the rod wasn't twisted far enough, the light would sometimes flicker on and off, and if you twisted too far counterclockwise when turning it off the lamp would come apart. This violated the principle of visibility, as it was impossible to tell how to turn the lamp on by mere inspection. It also had a flawed system of feedback, as sometimes the light would turn on, indicating the lamp was in the on position, but as you walked away it would begin to flicker because it had not been turned far enough. A simple solution to this problem is to have a knob or a switch to operate the lamp rather than the unnatural twisting that could dismantle the lamp.

The idea of affordances of physical devices can be applied to software user interfaces. If you open up a program and see a button that says "Print", it would appear that the program affords printing something. However, most programs do more than one thing, so it is possible that many of the functions of a program are not affordances. For example, in a word processor, it is a perceived affordance that we can type on our keyboard and the result will display on the screen, but there are many actual affordances hidden in the menus and keyboard shortcuts.


Glenn Sugden - 1/30/2013 14:04:44

1) A super-fancy mechanical chair that had three levers under the seat, knobs at the end of two of them, and a squeeze-grip lever attached to the third. Quick, what do they do? If you said "adjust the chair!" you'd be just as correct and just as vague. Apparently the levers control: the height of the chair, the (baffling) ratio between the back and the seat angle, and a (very handy) lock on the facing direction. One knob controlled the "looseness" of the attached lever (which is typical for adjustment levers and locking them down, I guess), but the other knob didn't appear to do anything (broken?). And the squeeze-grip lever? To lock the height adjustment of the chair, of course. The affordances offered by all of the adjustments were swamped by the needless complexity that they introduced. The conceptual model of an adjustable chair is fairly commonplace, but the variety of knobs, levers, and grips completely broke down any intuition that a user might have. You had to squeeze, pull, adjust and spin to discover how each of these gadgets worked, as I doubt that even a user manual would have been of any use (none of the levers, besides the one with the squeeze-grip lever attached to it, gave any indication of what it might do, let alone the knobs at the end of them). Besides, who leaves a user manual attached to a chair?!

- The fixes: steal the obvious, pneumatic vertical lever for the height - don't provide an option for "locking it tighter." Steal the obvious pneumatic lever for the seat back angle, decoupling it from the seat tilt, and don't provide a "tightening knob" either. The seat tilt doesn't seem to be important (an informal poll of 4 of 4 people :-). The facing direction appears to be somewhat important (2 of 5 wanted it), and a separate foot pedal to lock / unlock the facing direction swivel might be appropriate. Overall: put the control on the angle adjusting, don't "tighten" anything (and just binary lock it), and keep things as simple as possible while allowing some comfort / customization.

2) I think that there is a trade-off when virtualizing a device and its controls. You gain a great deal of freedom once you can disconnect a control from physical reality (e.g. a calendar view in which you can instantly see one day, a week, or a month's worth of material with the click of a button - something that would be impossible in the real world, unless maybe you recorded all of your appointments in three different calendars), but you now have a level of indirection between the control and the outcome (e.g. using a mouse to flip a page in a virtual book assumes that you already know how a mouse works and that the gestures were set up to feel "natural" in the first place).


Minhaj Khan - 1/30/2013 14:10:32

One object with bad design I have to use is the heater in my room. It has a knob which turns to increase the intensity of heating, but there is no marking of how far that knob has currently been turned. It has no grooves or lines next to the knob to indicate its current position. This violates Norman's design principle of feedback: sending back information to the user about what action has actually been done. In this case, it's not possible to tell at what intensity the heater is currently running, which is a grave lack of feedback in this design. The way I would fix this design is to have a groove on the knob itself and notches on the side of the knob to indicate where the intensity of the heater currently is. Another way would be to have a digital display which indicates the heater's setting or intended temperature.

There are some differences between the affordances of physical devices and software UIs. As Norman puts it, balls are meant to be bounced, knobs turned, switches flipped. These are affordances that people have grown accustomed to and which usually have apparent and reliable physical responses. The affordances of software UIs differ in the sense that their function may not always be apparent. There are a few handles, like UI buttons and sliders, that have more obvious affordances, but with many software UIs this isn't the case, and the function meant to be performed isn't obvious. This is one of the reasons some elderly relatives ask me how to proceed to the next page, or why a button isn't doing what they think it does, when filling out an online form or doing some other electronic task. Norman describes perceived affordances as what the expected outcome of an action would be. Actual affordances may differ in cases where there is a misconception of what the affordance of an object is, whether a physical device or a software UI. When there is a big discrepancy between perceived and actual affordances, it is a sign of bad design, especially in the case of software UIs.

Weishu Xu - 1/30/2013 14:12:10

1) One example of an everyday physical device with bad design is the automatic flush toilets and automatic paper towel dispensers that are in many nicer public bathrooms. Quite often, it is very difficult to determine what exactly sends the device a signal to operate, leaving individuals waving furiously for a towel that doesn't arrive or feeling the flush immediately upon placing the toilet seat cover and sitting down. This violates Norman's design principle of mapping because the movements in the physical world do not directly control the results that you would expect.

In order to address this problem, I would examine the issue the design is trying to solve: for example, individuals do not like to flush toilets with their hands for sanitation reasons, and many others forget or choose not to flush. For a simple solution, I would install a foot pedal for the toilet to flush or a foot pedal for the dispensing of the paper towel. This in itself might lead people to choose to flush because they no longer have to touch the toilet handle with their bare skin. To address the issue of individuals who choose not to flush the toilet, it would be possible to use today's existing technology, but install a significant delay of maybe a minute that would likely guarantee the individual had indeed already left.

2) I think the affordances of physical devices and the affordances of software user interfaces are comparable. For example, I believe affordances are developed based on the expectations we have from past experiences (i.e. through the doors, slots, etc. that Norman mentions in the text). We have expectations in software interfaces as well, such as pushing buttons, clicking boxes, scrolling, etc. Perceived affordances are things that individuals would assume an object is used for, while actual affordances are what the object was designed to do. In order to implement good design for software and make things "intuitive," we should understand what the perceived affordances of an object are in order to design the actual affordances. We need users to clearly anticipate how to use an object in order to minimize problems and improve simplicity of use.

Soyeon Kim (Summer) - 1/30/2013 14:14:15

I believe that a physical mouse is an example of a physical device with bad design. Almost everybody uses a computer daily nowadays, so I consider it an “everyday thing”. A typical mouse is not ergonomically designed; it strains people's wrists when used for long periods of time and can lead to carpal tunnel syndrome. I remember when I had to use Photoshop for intense graphics work, my wrist and fingers were so painful from the repetitive strain on the muscles.

I would redesign it so that the mouse allows our wrists to be neutrally positioned, allows a grip with the full palm (avoiding fingertip or claw-like grips, which can hurt fingers and put the wrist in an uncomfortable position), and provides a bump on the back end for the palm to rest on. Actually, there are now many ergonomic computer mice that prevent this problem.

There may be some differences, but I can mostly think of similarities between the affordances of physical devices and software user interfaces. This is because lots of good software user interfaces mimic physical reality. For example, sticky notes on Mac computers look exactly like the ones in real life, so users get the same perceived affordance for both, while the actual affordances may differ: one exists only within the computer (since it is a virtual sticky note), while the other is physical and real.

In this context, perceived and actual affordances can be distinguished: when we first see an object, we perceive what the object would do and build a certain expectation for it. The actual affordance is what the object actually does, regardless of the perceived affordance.

For example, let’s say there is a prop sink that does not actually work. A person would see the sink and think that he/she can turn on the water and use it just by looking at it; this is the perceived affordance of the sink. On the other hand, the sink actually does not work since it is only a prop. In this case, the actual and perceived affordances did not match one another. Good design should try to match the perceived and actual affordances.


André Crabb - 1/30/2013 14:18:16

1. One example of an everyday thing that I can think of that doesn't have such a good design is the Mac keyboard. Of course some could argue that everything Apple makes is beautiful, but I would argue that their keyboard, and the software behind it, could use some work. Apple makes things that they claim "just work." And sure, their keyboard works fine for basic things, but when you try to do anything a bit more complicated, it gets tricky. To be specific, Apple has changed the mapping of functions to keyboard shortcuts on their devices compared to other, more popular, systems. Without going into too much detail, Windows (the most popular OS) keyboards/systems support three special keys: Control, Alt and Shift (plus the "Windows" key, but that really just has one function). With Control, Alt and Shift, you can perform many functions by combining a subset of the three and adding in another button. For example, one of my favorites is Control+Backspace, which deletes a whole word. On a Mac, the combo is Alt+Delete. It's a subtle change, but I would argue that this goes against the "norm" of keyboard shortcuts. Not to mention that the whole function is not "visible" at all. This is only one small example. Another quick one I can think of is looking at your history in Google Chrome. On Windows/Linux, the key combo is Ctrl+H. On a Mac it's Command-Y. I would argue that using 'H' is more natural, and even more visible given a user's mental model of keyboard shortcuts. To fix this, I would design Apple's keyboard shortcuts to align with what seems, to me, not only more natural but also more sensible.

2. I would say yes, there are differences in affordances. Physical devices have "fundamental properties" that are easy to spot and understand. Take Norman's example of a chair. Though he doesn't go into much detail about "perceived" affordances, I would think that software interfaces are harder to perceive than actual devices, mainly because I'm not sure what the "fundamental properties" of software are. One thing I can think of would be an app's icon. Looking at an icon, the user might imagine, or perceive, what the functions of that app are. The "fundamental property" of software is code, whereas the "fundamental property" of physical things is their materials. How things are put together makes a great impact on a user's perception. When wood is arranged to look like a chair, a user perceives it to be a chair. However, a user cannot see, much less understand, the code underlying an app with a picture of an "S" as its icon. What are they supposed to perceive? Visibility plays a big role here, and I think that is one of the main differences between the affordances of physical devices and software.

Eric Ren - 1/30/2013 14:25:47

1) I do not know if this only applies to me, but I have a really hard time using printers in general. One time I had a lot of trouble figuring out how to print double-sided pages from my computer to the printer. Connecting the computer to the printer was easy enough, since there was a visible USB port. However, I could not figure out which buttons to press in the settings to print pages on both sides. The printer had big, colored mode buttons such as Print, Copy, Scan, etc. In the middle of the control panel was a small screen, which displayed the currently selected mode. There were also four buttons surrounding this display, which had different functions depending on what mode the printer was in. When I switched the mode to "Print", two of the buttons changed into navigation buttons, one of them became settings for the mode, and one of them had no purpose whatsoever. This device clearly violated the design principle of "natural mapping". I had to play around and press all four buttons to see which one went straight to the settings. The buttons themselves also weren't marked in any clear fashion, such as navigation arrows for the middle two buttons.

However, the screen itself did an OK job of giving feedback on what I did. Every time I pressed a button, the display would generally change its text, so I understood what I was selecting. As I mentioned previously, though, sometimes one of the buttons would not have any functionality, so pressing it did nothing.

If I were to redesign this control panel, I would add two buttons dedicated to navigation and have each button perform some sort of function. Of course this would increase the production cost, and the user might initially be intimidated by the number of buttons. However, as Norman mentioned in his story about the German bus driver, the driver found the buttons easy to use since every button mapped to one function.


2) Affordances of physical devices differ from affordances of software user interfaces because physical devices are made from real-world materials. For example, Norman mentioned in the reading that glass is a material that is "for seeing through and breaking". In general, software user interfaces do not have this sense of physical affordance; their affordances are more abstract. For example, a lot of software programs have menus at the top of the window. Within a menu, a small triangle icon on the right of an option usually indicates that hovering over it reveals another submenu.

While the affordances differ in the sense that one of them is 'physical' while the other is 'virtual', they both give hints for their functionality.

By "perceived" affordances, Norman most likely means the natural affordances and clues we have seen in the past, from which we get a sense of what the functionality is. "Actual" affordances most likely means the functionality the design is actually intended to convey, which might not necessarily be what users had in mind.

Dennis Li - 1/30/2013 14:26:08

1. The rice cooker. The rice cooker is such a simple device that it's hard to believe it could ever be done "wrong". There are only two settings: keep warm and cook. Seems simple enough, but alas, there are so many issues! First off, there is no "off" mode. If you don't want the rice cooker to waste electricity, you actually have to unplug the device! Secondly, because of the lack of an off mode, you are usually so preoccupied looking for the switch that when you do finally plug it in, you forget to start cooking! So often have family members, friends, and I forgotten to start cooking and had to delay dinner for another 30 minutes. I would redesign the rice cooker so that there was an off mode. Also, when you switched it out of the off mode it would beep, so that you never forgot to start your rice again.

2. Yes! Both kinds of affordances concern the quality of the device. For a physical device, the affordance is typically about its practicality in use. For example, the teapot with the handle and spout facing the same direction has a poor affordance, because the handle is not in a practical place. For software, the affordance is about ease of use and how intuitive the user interface itself actually is; practicality is still the main concern. Perceived affordance can be understood as what the affordance of a device appears to be before it has actually been thoroughly tested. Once you really begin to use the device, however, you may notice something lacking in the actual affordances.

Timothy Wu - 1/30/2013 14:31:29

1) Give an example of a physical device (an "everyday thing" as Norman would call it) with bad design that you have had to use. Do not think about software! Think about household appliances, sports equipment, cars, public transportation, etc.) Which of Norman's design principles did this device violate? How would you re-design it to solve the problem?

An example of a physical device that I had to use was the exit door in a public bus in San Francisco. This exit door was located in the middle of the bus near the back, and directly preceding it were two downward stair steps. The mechanism through which you could open the door was to step down onto one of the steps in front of the door for a few seconds.

First of all, this device violated Norman's design principle of feedback. When you step down onto the step, the step doesn't give you any feedback, like lowering, clicking, or giving you any indication that you are doing what is intended. You have to know beforehand that it takes a few seconds standing on the step before the door opens, or hear your annoyed fellow passengers yelling directions at you.

Additional things that this device violated were the concepts of mapping and affordance. When you look at the door, it has handles on it as if it were meant to be pushed to open. In your mind, you map the handles to the mechanism by which you would open the door. But this intuitive mapping is not correct, causing great confusion. A door with a handle affords being pushed. Steps afford stepping down onto as a path on your way out the door, not as a means for opening the door itself.

I would redesign this device by putting in a button that opens the door, or by allowing the door to be pushed open. Having a conspicuous button that, upon pressing, opens the door would be a great solution to this problem. Another solution would be to simply let pushing the door open it. These two solutions are implemented to great effect in buses in Berkeley and the East Bay.

2) Are there any differences in affordances of physical devices versus affordances of software user interfaces? In this context, what does Norman mean when he mentions "perceived" versus "actual" affordances?

Yes, there are differences in affordances of physical devices and affordances of software interfaces because physical devices have the benefit of having three dimensional form as well as a specific type of material that is associated with it. Norman mentioned that the material of the object has some bearing on the psychology of how you should interact with it, like his example of the glass window being shattered and the wood window being written on. Also, having a three dimensional form, with depth and curvature also lends itself to intuition about how to use the object, like Norman's example of scissors. In contrast, software interfaces are for the most part two dimensional and must rely solely on the visual perception of the user. The user can't touch the software interface, nor can he perceive how its depth influences the way it is to be used. Therefore, the affordance of a software interface must use more nuanced and clever visual techniques to create affordance.

In this context, Norman means that perceived affordance is what the user can see and how the user believes that the device is to be used. The actual affordance is how the device is actually to be used. For a software interface, perceived affordance is what the user sees that gives him or her insight into how the interface is supposed to be used. The actual affordance is what the actual purpose of the software is, like if the software interface is for a piece of banking software it would be used to access your bank account, etc.

Lemuel Daniel Wu - 1/30/2013 14:32:25

1. One of the worst designs among everyday physical devices that I interact with is light/electrical switches on walls - this includes switches for food dispensers in my sink, light switches, etc. These switches are almost never labeled, and when entering an apartment or house for the first time, it is impossible for users to know which switches to flip for the desired effects. This results in a lot of trial-and-error flipping of switches. This violates the principle of visibility, in that the relevant parts do not clearly convey what they are used for. I would re-design these switches to either have a slot above them for users to write/insert labels describing what the switches control, or even remove the switches altogether and install a touchscreen on the wall instead - one with a clear interface that allows users to change the on/off status of various electrical appliances throughout the house.

2. I believe that there may be differences between the affordances of physical devices and the affordances of software UIs. This is because software is digital, and the damage that you can inflict on a piece of software is usually local and temporary; another person could always come along, download the source code from the internet, and repair the damage that you have inflicted. Also, with software, people do not know how much money and time was put into creating it, so for most people, I believe software has very little affordance (especially when apps cost just a few dollars). Physical devices almost always feel more expensive, with higher perceived affordance (people feel like they are much more expensive and valuable) than software, which most people feel is okay to pirate (believing it's not worth much).

Samir - 1/30/2013 14:35:10

1.) Household keys have always been a pain for me. I have trouble figuring out which way the key fits inside the keyhole (up or down), and also figuring out which direction to turn it (clockwise or counterclockwise). Sometimes the lock is bad and needs a particular force in a certain direction. Regardless, there should be a standard protocol for which way keys should be inserted into a keyhole and which direction they should be turned. Also, with time, the fine points on the key wear down and will eventually need to be replaced. Then again, most things wear out and eventually need to be replaced anyway. Another related problem is giving someone else your keys temporarily to provide them access to a certain secure location. Many keys look similar, and it would be a really frustrating experience to try a bundle of keys one by one in a lock to see which one works.

The two design principles that I believe this device violates are visibility and the paradox of technology. At the time keys were invented, which was about 4,000 years ago, there were not enough technological advancements to automate this through electronic devices. About 10 years ago, before mobile devices, or even automated devices, keys were simple, but no one debated them because there was a lack of available technology to make the experience easier. Now, with mobile devices and automated unlocking (like for cars), it would be much easier to unlock based on proximity or the click of a button. I would argue that these replacements for metal keys can avoid Norman's potential issue of being "too complex to use" because unlocking by proximity or by the click of a button is much simpler than manually placing a key in a hole and turning it. Visibility is also a design issue here because there is no way one can tell which key is meant for which keyhole by just staring at it, mainly because you cannot see the inside of a keyhole. This is also a vulnerability, because if anyone could see inside of a keyhole, then it would be pretty easy to break in!

I would re-design keys by making a universal access application for all locks on mobile devices, where the user can choose to unlock a particular lock either by pressing a button for it (which only works when they are within X meters of that lock) or by enabling automatic unlocking whenever they come within X meters of it.
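
To make the idea concrete, here is a minimal sketch of what the unlocking policy might look like, written in Java with entirely hypothetical class and method names, and assuming the phone can somehow estimate its distance to a lock (e.g. via GPS or Bluetooth signal strength):

 // Hypothetical sketch: decide whether a given lock should open.
 public class ProximityUnlocker {
     private final double unlockRadiusMeters;  // the "X meters" threshold

     public ProximityUnlocker(double unlockRadiusMeters) {
         this.unlockRadiusMeters = unlockRadiusMeters;
     }

     // Open the lock only if the user is close enough AND either auto-unlock
     // is enabled or they explicitly pressed the button for this lock.
     public boolean shouldUnlock(double distanceMeters, boolean buttonPressed,
                                 boolean autoUnlockEnabled) {
         boolean inRange = distanceMeters <= unlockRadiusMeters;
         return inRange && (autoUnlockEnabled || buttonPressed);
     }
 }

The hard parts (estimating the distance reliably and talking to the lock securely) are left out; the sketch only captures the unlocking rule described above.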

2.) I would say that there are currently some differences in the affordances of physical devices versus software user interfaces, because I do not see software user interfaces as developed as physical devices. User interfaces on the web are becoming increasingly better with immense user testing, but this has only been going on for the past couple of decades since the web began. As Norman mentions, "when simple things need pictures, labels, or instructions, the design has failed," and I can immediately think of many software user interfaces that come with immense instructions for working the interface. When Norman mentions "perceived" affordances, he is referring to the impression users get when they see a device and immediately understand how to use it without further instruction. One example in the software world is the three horizontal lines in mobile applications that imply a menu section. Actual affordance refers to what the device should be used for based on its creator's intent, but this does not necessarily mean that all users will get the same signal, and it could fail.

Eun Sun Shin - 1/31/2013 19:35:40

1) My apartment’s elevator has a very poor design. Because it is an old elevator, it does not function like the modern elevators that most people are used to. Therefore, visitors often get confused. No visible instructions are posted on how to properly operate the elevator (but most elevators do not have instructions). There is a problem with the mapping between what visitors want to do and what is possible. For example, if a group of visitors enter the elevator wanting to go to different floors, each person will press the floor number they would like to get off at. Unlike most elevators, this elevator will take everyone to the first floor that was pressed (if 3 was pressed before 2, then the elevator goes to 3). Furthermore, the elevator forgets all of the other requests and stays at that floor until another button is pressed. Visitors who do not know this will wait in the elevator thinking that they are stuck, or will be brought to some floor where someone in the hallway called it. This problem also reveals that this elevator gives users delayed feedback; people do not know what floor they are going to until the door opens. To solve the many problems of this elevator, I would get a modern elevator installed in my apartment complex. The modern elevator would be smart enough to take in multiple floor requests at a time, go to the requested floors in increasing order, and be much easier to use.
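
As a rough illustration of the scheduling behavior described above (not any real elevator's firmware - all names here are made up), the redesigned elevator essentially just needs to remember every pending request and serve floors in increasing order, instead of discarding all but the first button press:

 import java.util.TreeSet;

 // Hypothetical sketch of the "smart" elevator's request handling.
 public class ElevatorController {
     private final TreeSet<Integer> pending = new TreeSet<>();

     // Called whenever a passenger presses a floor button; duplicates are ignored.
     public void requestFloor(int floor) {
         pending.add(floor);
     }

     // Next floor to stop at (the lowest pending request), or null if idle.
     public Integer nextStop() {
         return pending.pollFirst();
     }
 }

Keeping all requests in a sorted set is what lets the elevator visit floors in increasing order rather than jumping to whichever button happened to be pressed first.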

2) Yes, there are differences in the affordances of physical devices versus the affordances of software interfaces. Perceived affordances refer to how people think or expect something should be used, while actual affordances refer to what the product actually does. Designers have control over both perceived and actual affordances with physical devices, but they have no control over the actual affordances of software user interfaces. For example, the iPhone's screen affords touching, and when a designer adds buttons to the visible screen, the buttons are perceived affordances, because the affordance of touching is independent of the buttons added.