
How Today's Tech Alienates the Elderly

Barence writes "A UK academic has blamed unnecessarily complicated user interfaces for putting older people off today's technology. Mike Bradley, senior lecturer in product design and engineering at Middlesex University, claims efforts to be more inclusive are being undermined by software and hardware design that is exclusively targeted at younger users. He cites the example of the seemingly simple iPhone alarm clock. 'They're faced with a screen with a clock face and a plus sign icon, and they couldn't understand that you were "adding an alarm," so they didn't click the plus sign to get through to that menu. Pressing the clock image takes you through to choices about how the clock is displayed, and it's not easy to get back again.'"
  • by pspahn ( 1175617 ) on Friday May 20, 2011 @05:16PM (#36196516)

    Couldn't that also be interpreted as "necessarily simple"?

    Older generations don't get it not because of its complexity, but its simplicity. They might understand better if everything had a label and step-by-step info, but for the rest of us that do understand, this just adds complexity when it might not be needed.

    • by uncanny ( 954868 ) on Friday May 20, 2011 @05:23PM (#36196588)
      That's like saying calculus is easy just because you know how to do it, and someone more ignorant, if you will, would have to be shown how to use it. You grew up with computers, so you already know the ways to manipulate a computer. Today's OSes are VASTLY more complex than, say, DOS.
      • by Runaway1956 ( 1322357 ) on Friday May 20, 2011 @05:49PM (#36196872) Homepage Journal

        Mmmm. Points to you for being "right" - but - you're missing something too. I'm rather computer savvy, I'm aging, and looking at a display of an alarm clock, I would hesitate to press the "+" sign to "add an alarm". It's a generational thing, I would guess. I grew up "setting the alarm". Later, when alarm clocks and/or watches had multiple alarms available, I continued to "set the alarms". Add an alarm? The terminology leads me to think that I'm going to add a new clock, or in this case, add a new interface for another alarm clock. I don't want another alarm clock - I want to know how to "set" the one I see!

        • Re: (Score:3, Informative)

          by Anonymous Coward

          I'm only 24 and I would never think of adding an alarm either, and probably wouldn't press a plus sign on a clock unless I was expecting it to show multiple time zones.

        • Well, it seems obvious that the feature was designed around an integrated calendar perspective. It's not just an alarm in this context, it's a scheduling event on a larger scale. I totally get what you are saying; I'm simply adding that it's not JUST an alarm clock, it's part of a calendaring and scheduling system.
        • by IICV ( 652597 ) on Friday May 20, 2011 @08:14PM (#36198180)

          I'm rather computer savvy, I'm aging, and looking at a display of an alarm clock, I would hesitate to press the "+" sign to "add an alarm". It's a generational thing, I would guess.

          I agree that it's a generational thing, but I don't think it's one in the way you're describing; in fact, I think it has to do with the fact that you would hesitate.

          See, the older generations grew up with computers as these big, fragile things; you couldn't fuck around too much, otherwise something might break and it would be all your fault. The generation before that grew up with industrial and farm equipment that was literally dangerous to touch; poke the wrong thing, and you might not have a finger afterwards.

          People from those generations are afraid of exploring, because they might accidentally change something and break the computer or lose a finger.

          That's not how we do things in modern interface design. The goal is, basically, to make exploration have zero cost; as long as you don't change some state that's visible to you in the program, you can touch buttons all you want and explore the menu structure without any cost.

          So yeah, there is a difference - you would hesitate. Someone ten or twenty years younger wouldn't. That's pretty much all there is to it.

        • I think that your comment illustrates a large part of the problem. Non technical people cannot imagine what features a program could have, since the features are becoming more and more abstracted from real-life metaphors.

          With older alarms, there was one alarm, or at most a small fixed number of them. You could see them and interact with them. With the newer alarm app, you can have an arbitrary number of alarms, and they don't exist until you tell the program to create them.

          You aren't "setting the alarm", you are creating

      • by blahplusplus ( 757119 ) on Friday May 20, 2011 @06:45PM (#36197446)

        "that's like saying calculus is easy just because you know how to do it,"

        No, it's not. Many things are easy enough already. I have grandparents who use their old age as an excuse not to learn new things. I really don't buy that old people are incompetent; when it comes to their own interests and things they like, they sure do put in the effort.

      • by Kjella ( 173770 )

        *blinks* Are we talking about the same DOS I used to use? Seriously? The only reason you could marginally call current OSes more complex to use is because they do so much, much more. For example, this whole "multitasking" thing: pretty easy when there's just one app, right? (Oh, I know there were some exotic ways to do multitasking, but mostly you didn't have the RAM anyway.)

        I remember my first cell phone.... a lot less complex than my iPhone, yes. But all it could do was call and send SMS - I don't even remember

    • by foobsr ( 693224 )

      but for the rest of us that do understand, this just adds complexity when it might not be needed

      This would be 'adds complexity when it is not needed', but a prerequisite for understanding this is having a grip on everyday language. That may well be too complex an endeavour if one is trained to live with 'simple' interfaces.

      CC.

      • by peragrin ( 659227 ) on Friday May 20, 2011 @06:42PM (#36197410)

        Ah, but language changes drastically every 10-20 years.

        There are more words in today's language for things that simply didn't exist 10 years ago, let alone the number of words created for things that just didn't have names 70 years ago.

        70 years ago was 1941. Things like atoms were only suspected. (The atom bomb is less than 70 years old.) In 1953 came the double helix DNA. Transistors came in the late 1940s.

        Stop and think: you learned more words in grade school than your grandparents had learned by the time you were born.

        The amount of knowledge that has to come with understanding those words is huge, even though the words seem simple when you first learn them. These are people who thought tv's in color was amazing, and that there was no need for more than one phone in a home, and that phone had to be spun to work right.

        UI's don't matter. The elderly will simply not use the devices. Soon enough they will move on, and the new elderly will be a little more used to them. Advancement through attrition. It is an ugly truth, but nothing to get worried about.

        • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Saturday May 21, 2011 @02:04AM (#36200040) Journal

          70 years ago was 1941. Things like atoms were only suspected.

          What do you mean by "only suspected", here? Mendeleev's periodic table, arranged by atomic weight, was published in 1869. The electron was discovered in 1897.

          Maybe you mean artificial atomic reactions?

          In 1953 came the double helix DNA.

          Yet DNA itself had been known since 1869, and it was known to have a regular structure in 1937. In 1943, it was clear that DNA carried genetic data.

          The double-helix model is tremendously important to biology, but not at all important to the fundamental ideas of DNA that make it a household term today.

          These are people who thought tv's in color was amazing,

          I'm sorry, but they still are. I'm 24, and it still blows my mind how much we're living in the future. I often wish I could give Isaac Newton, or, say, Benjamin Franklin, a tour of our modern world -- and the TV is the first thing that comes to mind.

          UI's don't matter. The elderly will simply not use the devices.

          UIs absolutely do matter, but I don't agree with the article here -- modern UIs are generally decent, and the biggest thing the elderly lack is the understanding that it's OK to poke at them and play with them to figure out how they work. When the motivation is there, well, they're not likely to be the ones jailbreaking or programming or anything fancy, but my grandparents (the ones still alive) have all at least learned to use email, because that's really important to them -- keeping in touch.

          It's just that the bar is a bit higher -- if they have an alarm clock that works, and they don't already magically know how to use the iPhone alarm clock, they'll go with the one they already know how to use instead of actually trying to learn the new one for a minute or two.

    • by rmstar ( 114746 ) on Friday May 20, 2011 @05:29PM (#36196658)

      Frankly, I think the problems with UIs are unsolvable. There is a point of diminishing returns, and after a while the returns become negative. And whoever is left out after that is a hopeless case.

      This is how it actually works in practice [youtube.com] (it's supposed to be satire, but, damn, it's way too accurate).

      • I think you're right. Reasonable people who have interest and aptitude can learn whatever interface they need to learn. Nowadays one gets the impression that most people hate computers; they want magic. I particularly loathe it when some user goes on and on trying to make sure I understand what it was they did wrong and why the result was unsatisfactory. After you tell me once, I can tell you what you need to do -- that's the important information the user needs to focus on. But so many otherwise intelligen
      • Unsolvable? You shouldn't accept such a proposition.

        Interfaces are finite and enumerable. There is only just so much functionality in even the most powerful app. The essence of an interface can be captured and expressed with logic. And we have all kinds of tools that can handle logic. We've been doing this for software for years. The "interface" of a programming language is much more open ended and complicated than a mere user interface, and we've been reasoning about language for decades.

        Yes, bad

    • by Culture20 ( 968837 ) on Friday May 20, 2011 @05:36PM (#36196734)
      The one thing I've noticed about "computer-stupid" people of any age group is that they're unwilling to click on anything unknown or just test something. It's like they've lost the capacity for experimental play and refuse to learn on their own.
      • Yes! About 20 years ago I was a code monkey for a small engineering firm, and the receptionist was forever having trouble with... oh, I can't remember what computer issue it was. So I took the "teach a man to fish..." concept to heart and tried to explain to her why her computer kept messing up and what she could do to fix it herself, and she interrupted me yelling "I don't want to know what's wrong, I just want it fixed!"

        And if I had a byte for every time I heard -- not just from novices but from tech

      • by DrgnDancer ( 137700 ) on Friday May 20, 2011 @06:02PM (#36197016) Homepage

        I think it's more fear that they'll break something. Most of us with at least a reasonable understanding of computers have realized that for the most part it's "safe" to play with computer settings and tools. It's rare to screw something up so badly that it can't be fixed, and in large part computer interfaces are designed with either implicit or explicit "undo" options (worst case, exiting a document without saving will nearly always take you back to a "clean" document). Like the monk in the YouTube video your sibling posted, though (and if you haven't watched it, it's hysterical), many non-technical users worry that they will damage either the computer or their data if they mess around with stuff.

        Personally I consider this attitude somewhat foolish (as I think do most people who fall into the "geek" category), but it's fairly common. Of course if you try to explain to the person that they're unlikely to hurt anything by playing around, they will immediately tell you that it's easy for you to say that, as you're an expert unlikely to hurt anything. It doesn't really occur to them that most of the expertise you or I have comes from a willingness to experiment.

        • by MobyDisk ( 75490 )

          I think it's more fear that they'll break something.

          I wonder if they had that same fear the first time they picked up a hammer, a screwdriver, a knife, or a ball? Clearly not: children have no such fears. I think as people get older, if they are not vigilant about maintaining everyday skills, they become afraid to experiment at all. So they stop learning.

          • To be fair, it's pretty obvious when a hammer is going to cause damage. 'rm -rf ~joe/*' and 'rm -rf ~joe /*' are pretty close to one another, but the stray space in the second one tells rm to wipe the entire filesystem instead of just joe's home directory. And usually computers were first encountered at work - where people are paranoid about screwing something up.
        • I can't tell you how many times I have heard a beginner told that "you can't break something", and in 5 minutes they have. And by break, I mean make a change that only an expert user can undo.

          The inevitable response from the expert is "well, I didn't expect you to do that!"
      • This is usually a learned behavior. Usually these people used to click pretty much everything, and ran into a situation where something they did "broke" the computer. Either they removed a shortcut or deleted a file, but whatever they did, they couldn't figure out a way to restore the computer to the way it was before. They probably ended up calling their nephew (in my case) to come in and fix it, and after that they would hardly touch it, afraid they would "break" it again.
      • by pspahn ( 1175617 )

        Yes, you're entirely right, and it's not even about "learning" how to use a piece of software, it amounts to someone not being explicitly told to click on something, so they don't.

        I've watched my dad use a web browser. It's absolutely agonizing. He never updates anything (he still uses Quark 4) and just gets confused by the simplest of differences between one version and the next.

        Why doesn't he explore? Because, "that's not how I used to do it". He doesn't even read menus or anything, it's all just muscle

    • I can easily make that simpler. Simply change the plus sign into a little picture of an alarm clock. Or they could change it to the words "add an alarm".

      Why would anyone naturally assume a plus symbol has anything to do with alarms? I wouldn't assume that, and I would only click it because I am trying to experiment and figure out what things do by clicking every button.

      It's stupid minimalist UI that's the problem: designers trying to be more artistic than simple. I mean, look at Google Chrome! You can't find an

      • It's easy enough to change that button to "add an alarm" perhaps, but the "+" is used throughout the UI to add whatever is appropriate to the application in question. It adds contacts in the contact app, calendar entries in the calendar app, new notes in the note app, and new alarms in the alarm app (along with many other things). It's what's known as a "reusable element", and once you understand its function in one application it should be relatively trivial to figure out its function later on. Rather
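As an aside: that reuse is baked into the platform itself. Here's a minimal sketch in modern Swift of how UIKit hands every app the same system "+" item (the 2011-era API was the Objective-C equivalent; the class and method names here are invented for illustration):

```swift
import UIKit

// Sketch: UIKit ships the "+" as a reusable system item, so every app
// that asks for it gets the same button with the same meaning:
// "add one of whatever this screen lists".
class AlarmListViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // The same .add system item yields the "+" whether this screen
        // lists alarms, contacts, or calendar events.
        navigationItem.rightBarButtonItem = UIBarButtonItem(
            barButtonSystemItem: .add,
            target: self,
            action: #selector(addTapped)
        )
    }

    @objc private func addTapped() {
        // Push a "new alarm" editor here.
    }
}
```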

    • I think it's an issue of familiarity.

      Some elderly get comfortable doing things one specific way. By the time they're forced to change, things are completely unfamiliar. When nothing is familiar, they don't feel comfortable enough to just play around until they figure it out.

      In contrast, we use tech every day. Nothing seems entirely unfamiliar to us because we have a very gradual introduction into new things. We intimately know the old feature, so much that the new one feels obvious and with immediate be

    • Simplicity is fine, but only if you can figure it out. An alarm clock is not supposed to be complicated. If a user can't figure out how to set the clock, then it's a UI fail.

      Even if it's shiny and cool.

    • Couldn't that also be interpreted as "necessarily simple"? Older generations don't get it not because of its complexity, but its simplicity. They might understand better if everything had a label and step-by-step info, but for the rest of us that do understand, this just adds complexity when it might not be needed.

      Age has little to do with it. Once a person, young or old, has seen it done once or twice, they get it. It's really nothing more than that younger people were the early iPhone/iPod touch adopters, or early adopters of computers that used similar user interface widgets. If you hand an iPhone to an older person today, it's more likely to be something new and different, just like it was for a younger person five or so years ago.

    • by Chemisor ( 97276 ) on Friday May 20, 2011 @06:20PM (#36197200)

      It only seems "simple" to you because you have seen phone clock interfaces before and know that you have to "add an alarm". To someone who has only used physical clocks, that's a ridiculous idea. You don't add an alarm; you set it. On analog clocks you would have an extra hand that would be the physical embodiment of the alarm.

      A much more obvious thing to do would have been to use an alarm icon (a bell is fairly well-known; a "jumping" alarm clock might be another) with a superimposed plus instead of just a plus. You might also write "Alarm" on it. That way it becomes clear that the button has something to do with alarms. A bare plus has no meaning at all.

    • It would be better if instruction manuals were actually useful...

      I'm a long-time computer user and not to the old-fogie stage yet, but I wouldn't know that a + sign means to add an alarm either. And I continue to be disgusted by incorrect or missing information in instruction manuals. They seem to be more interested in putting them in 10 different languages than making sure they are correct and understandable. I would bet that younger people are more willing to just "figure it out" while the first thing a more

    • I'm a programmer and a sysadmin and when I first started using iDevices (iPod touch, iPhone, iPad specifically) I found the user interfaces to be completely non-obvious. Now, having worked with them for a few months I find it simple enough, but that initial curve is actually fairly steep. Couple that with not wanting to learn new things anyway and no wonder the elderly aren't adopting it.
    • by digitig ( 1056110 ) on Friday May 20, 2011 @07:13PM (#36197712)
      If "it's not easy to get back again" then it's a fundamental design flaw that discourages experimentation, not "necessarily simple". And having the plus sign is an unnecessary complication -- tap the clock on my phone and it goes to the alarm setting menu with other configuration available from there. I'm sure the younger generation could cope with that too.
  • What? (Score:4, Insightful)

    by DWMorse ( 1816016 ) on Friday May 20, 2011 @05:19PM (#36196548) Homepage

    In rebuttal, I offer my personal anecdote: My mom has never had as easy a time using technology as now in 2011, now that I've set her up on iOS and, soon, OS X. Just because the older generation doesn't find it intuitive doesn't mean they can't figure it out with a little tinkering, or at worst, very little AppleCare phone support. To insinuate they can't set freaking alarms because they might accidentally push the wrong thing at first is insulting.

    • I totally agree and will add my own as well:

      My grandmother plays Wii like a champ, backs up her computer more frequently than most people, and has an Android phone. My grandfather doesn't recognize his own daughters anymore, but can still use an iPad...

    • It's about a new "language" for them to learn - not that it escapes them intellectually. And it's NEVER been easier than now, and I expect it will get easier moving forward. But the Brit has a point - it is still way less than optimal, partly because who the fuck is TEACHING them?

      I spent three two-hour sessions with mom (70+) and now she is cutting movies, playing solitaire and emailing like anybody else, with attachments, particularly Word docs, Printmaker docs, and photos. Not bad. Not at all bad.
    • In rebuttal, I offer my personal anecdote: My mom has never had as easy a time using technology as now in 2011, now that I've set her up on iOS and, soon, OS X. Just because the older generation doesn't find it intuitive doesn't mean they can't figure it out with a little tinkering, or at worst, very little AppleCare phone support. To insinuate they can't set freaking alarms because they might accidentally push the wrong thing at first is insulting.

      Of course, not every elderly person has somebody to "set them up" on iOS or OS X. Would your mother have been as successful with this if she had had to do it by herself, with no help from her son? If not, then your rebuttal falls short of the mark.

      Face it, this is a serious issue for software developers. If iPhone sales are going to be limited to the young, well, that market is already saturated and the declining birthrate doesn't bode well. Apple needs to expand its market to those people who are current users

  • How many people here would take one look at that UI and assume that the + meant 'mod this up'?

    But seriously, why a plus sign inside a square? Why not an oblong marked ALARM?

    • Sometimes a word is worth a thousand pictures. Changing the "+" to "Add Alarm" probably would have solved the problem.

      • Exactly. English words are consistent for English-speaking people. Icons have no such standard, and having to memorize the functions represented by icons when they are application-specific (take a look at Solidworks 2011 if you want to see some examples) is an unnecessary hurdle.

    • by Jiro ( 131519 )

      Because if you use the word "alarm" you have to make a different version of the clock for each country you sell it in that speaks a different language.

      It's the same reason why everything with an on/off switch has "1" and "0" or icons, not the words "on" and "off".

      • Because if you use the word "alarm" you have to make a different version of the clock for each country you sell it in that speaks a different language.

        It's the same reason why everything with an on/off switch has "1" and "0" or icons, not the words "on" and "off".

        Just as the other text for apps on an iPhone or Android phone appears in the native language the phone is set for, wouldn't the word "Alarm" also do that? However, you really wouldn't need the word "alarm" anyway. A picture of a bell or an icon that actually looks like an alarm clock would probably suffice.

        • by tepples ( 727027 )

          Just as the other text for apps in an iPhone or Android phone appear in the native language the phone is set for, wouldn't the word "Alarm" also do that?

          The application chooses the string from a list with one entry for each language; it doesn't machine translate (I hope). You'd still need to add a string for every locale where you distribute your application.
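That per-language string table is exactly how it works in practice. A minimal sketch in modern Swift (the key, fallback value, and translations are invented for illustration; in 2011 the call was the same NSLocalizedString macro in Objective-C):

```swift
import Foundation

// The lookup key maps to per-locale table files the developer writes
// by hand, e.g. (hypothetical entries):
//   en.lproj/Localizable.strings: "add_alarm" = "Add Alarm";
//   de.lproj/Localizable.strings: "add_alarm" = "Wecker hinzufügen";
// The framework picks the table matching the device locale; nothing is
// machine translated. For a locale with no table, the lookup falls back
// to the value below.
let label = NSLocalizedString(
    "add_alarm",
    value: "Add Alarm", // fallback if no table has the key
    comment: "Button that creates a new alarm"
)
print(label)
```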

    • by Tackhead ( 54550 )

      But seriously, why a plus sign inside a square? Why not an oblong marked ALARM?

      Because if you replace all the words (and menu options) with pictures, you don't have to pay a team of translators any money to localize the product for non-English-speaking markets.

      You make a claim like "menus confuse people", take the menus off, and claim that you're saving screen real estate. It sounds trendy, because desktop monitors are now down to 1080 vertical pixels, and all the sexy gadgets are tablets and phones wit

      • Your snarking at web browser design misses that all of these things already existed in non-icon form, so you're not saving money on that path; you're costing money. Also, any application that is accessible to the visually impaired includes localized strings anyway. And really, the iPhone has "+" at the top right, but it already has "Edit" at the top left, so it's really just a marginal cost.

        The real cost savings come from rapidly implementing entirely new features, because you do not have to design and ma

  • by Daetrin ( 576516 ) on Friday May 20, 2011 @05:22PM (#36196574)
    Is there some way to make such things simple enough for the elderly without detracting from the functionality for younger people? iPhones are far from the only thing that the elderly have trouble with, but it doesn't seem wise to tailor everything in the world to cater specifically to them. If designers can't find a way to make a device usable by both the young and the old without compromising on the usability for either group, then there really ought to be two separate devices. I've certainly seen enough infomercials to know there's a large market of elderly people out there you can market to directly.

    I'm certainly sympathetic, since I plan to be elderly myself one day, but I'd like to hope that when that day arrives I'll either try to learn how to use whatever new-fangled thing the kids are into, or use alternative devices/software/whatever that fits my needs. (Kind of like how the first thing I do after installing Windows 7 is make extensive modifications to give it a "Windows Classic" theme.)
    • by Hatta ( 162192 )

      No. Elderly people want what they are used to. They are not used to computers, simple or complex. There's nothing wrong with giving Grandpa a stand alone alarm clock, he already knows how to use it. The ultimate solution for this problem is attrition. Eventually everyone alive will have grown up with computers.

      • Elderly people are still people. Give them a reason to learn something, and they will. I've seen it mentioned several times that the Kindle has been a massive success in the senior citizen crowd, in no small part because every book becomes a large-print book that they don't have to find their reading glasses for. Or consider the reaction of people now in their late 60s to, say, the Internet. It was a maybe thing, until they realized that it meant they could actually have relationships with their grandchild
    • by Chemisor ( 97276 )

      Draw a picture of a bell under the +. That will be enough.

  • by nysus ( 162232 ) on Friday May 20, 2011 @05:23PM (#36196596)

    Just a couple of weeks ago, I was sitting next to a gentleman, 55 to 60 years old, who was having a great deal of trouble performing what most of us would consider the most basic of functions, such as how to add a new city to the iPhone's built-in weather feature. He had just purchased the phone, and so I helped him through the process. It was quite an eye-opener for me. He had not even figured out how to appropriately tap on the screen (he was pressing on it as if it were a mechanical button and so his touches never registered). He was constantly misspelling with the keyboard, and could not figure out how to correct a mistake. It took him about a dozen efforts and maybe about 3 minutes before he successfully typed in Boston.

    I would estimate he would need at least a few hours of one-on-one training before he could begin to use some of the iPhone's other most basic features.

    • No interface on earth is instantly usable for people who don't have a baseline competency in similar interfaces.

      My great grandmother -- born in 1911 -- never learned to drive a car in her life. Is it because car interfaces are poorly designed, or because she just didn't care enough to bother learning? Considering millions of people her age had no problem with it, I'd argue the latter.
      • ...Considering millions of people her age had no problem with it, I'd argue the latter.

        I'm not sure about that...from what I've seen of elderly drivers it very well could be a bad interface. Every time I've almost been run off the road on my motorcycle it's been an old person. In fairness though the last one (4 days ago) was texting and driving so at least they have one interface down reasonably well.

    • he was pressing on it as if it were a mechanical button and so his touches never registered

      It's pretty amazing how frequently this happens. Even with buttons that are obviously just electric switches, old people will mash down as hard as they can, as if the force they apply is correlated to the success of the operation. Maybe they're reminded of old typewriters, where you actually did have to mash down on the keys to make it work.

    • by MobyDisk ( 75490 )

      he was pressing on it as if it were a mechanical button

      That is very insightful, since the entire point of a touch screen is to look and act like a mechanical button. Obviously it doesn't, but perhaps most of us have adapted to the screens and understand that. I remember the first time I used a microwave with those non-tactile buttons and I hated it.

      It took him about a dozen efforts and maybe about 3 minutes before he successfully typed in Boston.

      THAT might have as much to do with the keyboard as his learning curve on the touch screen.

  • by Anonymous Coward

    I help my 82-year-old dad out every day on his computer. He has used them in business since the 80's and old CRT 80x24 text interfaces worked much better for these geezers.

    I got him a huge 27" widescreen which really helps him read the text, but windows are so big, the menus might as well be in a different country from the minimize/maximize/close and it takes forever to get his eyes from the center of a window to a status display at the bottom.

    What many older people need is fewer options, or some way to put al

  • by Anonymous Coward on Friday May 20, 2011 @05:26PM (#36196624)

    Once someone sees that the plus sign adds an alarm, then they'll know the plus sign adds an alarm. You only have to figure it out once.

    I'm not elderly, but I'm old-ish (63) and I watch people my age struggle with very simple things because rather than learn the underlying concepts, they learn by rote. They learn "the second icon from the left does this". They don't bother to learn what the computer is really doing. Use words like "filesystem" and their eyes glaze over. But without a basic understanding of the technology, everything on the screen is going to be "magic" - if you don't understand the whys and wherefores, there is no hope of ever accomplishing anything but rote memorization.

    I'd say about 90% of the time, they are perfectly well able to understand what's happening if they want - they just don't want to. You can't fix "don't want to learn". The ones who value learning, who don't have a culture of shutting off their brains and refusing to ever think, do just fine.

    Of course this doesn't apply once certain disabilities like Alzheimer's enter the picture - that's a different problem and one no UI is going to fix.

    • Fully agreed. All the elderly family I have in Greece have feature phones, but they learn by habit: press some numbers, press the green button, then the red. Nothing else. They don't even know how to program their TV; someone else has to set it up for them the first time, and then they remember by habit that the button with a 3 on it is "News channel", for example. They don't want to learn how things really work. I tried. I tried with my mom, I tried to explain the logic to her, but she prefers to write down on

    • Even more to the point, once they figure out that the plus sign adds application-appropriate things, the knowledge should carry over. As I said in an earlier post, the "+" is a widely reused element in the UI. It adds contacts in the contact app, events in the calendar app, cities in the weather app... All in all, the default Apple applications have a very consistent UI. Once you understand how one of them works, you're a long way toward understanding all of them (a UI which many third party apps also adop

  • These things do come with manuals.

    • by tepples ( 727027 )

      These things do come with manuals.

      More and more lately, these are electronic manuals, and you already have to know how to navigate the device in order to read the manual.

  • Computers function in a different way to physical objects.

    Making people accept this is far, far simpler than trying to force every computer idea into a real-world analog.

    Stop treating a computer like a car or bike - you can't learn it in a week, and you'll be spending a lot of your life using one, so get it right. Even if you're old you can learn (and it'll do your mind good too).

    Everyone can name ways that an application should be simpler. Trouble is, ask 100 people and you'll get 100 different answers, many

  • It's most anyone who isn't tech-savvy.

    Whatever happened to interfaces designed for a *user*, not a techie (like we had with the Newton, for example)?

    The interfaces should adapt to us, not the other way around.

  • It has one of the most intuitive user interfaces ever. So much so that even the noobiest of tech/computer users can figure it out. Perhaps if they can't understand how amazingly easy to use the iPhone UI is, they need a dummies book or one of these [intomobile.com]
  • If you can read, you can use tech. Read the manual. Google it. Look up a video on YouTube. Or just ask someone.

    There is no excuse for ignorance, in tech or anything else. Saying that older people have some special disadvantage is just a load of crap.

  • At the risk of invoking Plato's rant about youth: this isn't very new. The last couple waves of technology befuddled new users too. Remember all the VCRs permanently blinking 12:00 in the 1980s, followed by microwaves doing the same in the 1990s? And that's just sticking to old jokes about digital clocks. But I'm sure most of us who're old enough, or knew others who were old enough, have heard a wealth of similar things about devices of the same decades... and similar things about cars dating to several dec

  • The problem is that the elderly stopped learning new things for too long, so the mental tools atrophied. A young person with a brand new smart phone would come in with a clear head and immediately start mentally mapping out functionality and figuring out, maybe not even at a conscious level, all the underlying metaphors. They quickly intuit the best way to tap the screen. They figure out how to back out of things and start pressing buttons to see what everything does.

    Being old doesnt prevent you from doing any of th

  • That's just basic bad design. I would do (and have done) the same thing with some alarm widgets. Actually, you would be surprised how many alarms are badly designed - probably because anybody can program a simple clock. Why the heck would you want to change the clock design by tapping it? Is that the main functionality of the clock? The clock-change function should be in the menu somewhere. And if it rings, I expect to see a great big bell or button which I can hit to switch it off. Preferably I should be able

  • i am 36 (Score:5, Interesting)

    by drolli ( 522659 ) on Friday May 20, 2011 @05:50PM (#36196880) Journal

    and I am alienated by today's user interfaces. What alienates me most is that showing keybindings seems to be a thing of the past, and pure text menus are not possible to turn on. I like a simple alphabetically sorted list to start apps, which would take less space and not be as weird as having 9 screenfuls of badly designed, stupidly copied or sometimes identical icons. And sorry, on most alarm clocks on smartphones you could easily write "add/set alarm" instead of the plus - no lack of space there.

  • by roc97007 ( 608802 ) on Friday May 20, 2011 @05:50PM (#36196886) Journal

    It's not an older/younger thing, it's entirely an "unnecessarily complicated or obscure" thing. Sure, younger people have more experience with enigmatic interfaces, and are more likely to keep trying without getting frustrated, but that doesn't necessarily mean that the interface in question appeals to young folk. For instance, a "set alarm" button would be more immediately understood regardless of age, or (and this point is completely missed) degree of geekiness.

  • Both gone now, but vastly different in use of tech. I don't believe my mom ever used a PC, but my dad was a UNIX and MS Office instructor up until a few months before he passed at 81. I remember my son, at age 4, trying to teach my mom how to use MSPaint.

    In 2050, when you whippersnappers are 70 and 80, all the 'kids' will be ragging on you geezers about how you don't 'get' the new-fangled brain-silicon interface, with the 3DHD corneal implants.
  • by EdwinFreed ( 1084059 ) on Friday May 20, 2011 @06:02PM (#36197010)

    Maybe he's using a custom clock app or something, but on my iPhone the built-in clock app has four clearly labeled mode-setting buttons at the bottom: "World Clock", "Alarm", "Stopwatch", and "Timer". Pressing the one called "Alarm" to set an alarm seems, well, obvious, and when you do that you get a screen saying "no alarms" and exactly one "+" button you can press, so unless you simply freeze up at that point I don't see how this can be so confusing. In particular, no clock face is displayed at this point, so there is no possibility of "Pressing the clock image takes you through to choices about how the clock is displayed, and it's not easy to get back again."

    If you want to criticize the alarm and calendar stuff on the iPhone, a better place to start is the spinning dial thing used to enter times. (Which is what comes up once you press "+".) A lot of people dislike this and find it hard to use. I don't find it difficult personally, but I have to admit I'd prefer a numeric keypad.
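For reference, that dial is a stock UIKit control, which is part of why it turns up everywhere. A minimal sketch in modern Swift contrasting it with the numeric-keypad alternative (the class and variable names are invented for illustration; in 2011 this was the Objective-C version of the same UIDatePicker):

```swift
import UIKit

class AlarmTimeViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // The "spinning dial thing": hours and minutes on rotating wheels.
        let dial = UIDatePicker()
        dial.datePickerMode = .time
        dial.frame.origin = CGPoint(x: 0, y: 80)
        view.addSubview(dial)

        // The alternative many people ask for: a plain text field that
        // brings up the numeric keypad instead of the wheels.
        let keypadField = UITextField(frame: CGRect(x: 20, y: 320, width: 120, height: 32))
        keypadField.keyboardType = .numberPad
        keypadField.placeholder = "HH:MM"
        view.addSubview(keypadField)
    }
}
```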

  • Designers tend to like cute and clever. Sometimes this results in interfaces that are incomprehensible. Apple is the worst offender. It has nothing to do with age.
  • These are two things that the elderly stereotypically are not accustomed to and have not had as a constant requirement throughout their lives. I suspect this will be recognized as a generational issue. The elderly of tomorrow who are today's Gen-X, Gen-Y & Millennial adults will not have this problem. We've been born into a culture that will mow you down if you don't keep yourself up to date.

  • He had mastered Mario Brothers using a cheat code we installed for him, but could not rescue the princess in level 5. He finally became enraged, ripped the little gray Nintendo box from the TV plugs, and smashed it to the ground. Ok, that was a long time ago, but he's not going near my laptop.
  • Once you've released your UI, it's game over. Go away and design something else please. My experience in family tech support is that patches that try to tweak the UI inevitably result in long angry rants over the phone as if I was the one who decided to "mess it up".
  • Computers have this concept of things being n-ary instead of single or double. For example, most real alarm clocks have 2 alarms, so there is either a switch with 2 positions or 2 sets of buttons. So you might have on1, off1, on2, off2, etc. But a computer alarm might have a max of 2 million alarms, so instead you have the concept of adding or removing an alarm.

    This concept frustrates computer-illiterate people all the time. It happens on cameras, thermostats, televisions, etc. Remember when remote controls h
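A minimal Swift sketch of that shift, with invented type names: the physical clock hard-wires its alarm slots, while the software clock keeps a collection that starts empty, which is why the UI talks about "adding" rather than "setting":

```swift
import Foundation

// A physical clock hard-wires a fixed number of alarm slots:
struct TwoAlarmClock {
    var alarm1: Date? // the "on1/off1" switch
    var alarm2: Date? // the "on2/off2" switch
}

// A software clock keeps a collection instead; alarms don't exist
// until you create ("add") them, and there's no fixed slot count.
struct AlarmApp {
    private(set) var alarms: [Date] = []

    mutating func add(_ time: Date) { alarms.append(time) } // the "+"
    mutating func remove(at index: Int) { alarms.remove(at: index) }
}

var app = AlarmApp()
app.add(Date()) // "adding an alarm" -- the step TFA says confuses people
print(app.alarms.count) // 1
```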

  • It's easy to blame the elderly. Most of the posts on this topic are doing just that. However, if Apple wants to blame the elderly because they can't figure out iOS, that's fine. I'm sure Samsung or HTC or somebody will be more than happy to sell them a phone running Android that is configured to meet the needs of the purchaser.

    When Apple was pretty much the only smart phone around, they could come out with whatever interface they wanted and users had to adapt to it. That is no longer the case. And, sinc

  • 1px wide window borders -- for someone with shaky hands these are nearly IMPOSSIBLE to resize (most windows don't include the drag handle).

    Nearly all the other non-shitty (high-contrast) themes it comes with also have 1px-wide borders. I get that the border area is intended for a clean look, but does the look have to be tied to the usability? Can't an invisible area around the window provide the drag-handle features in a size not dependent on the pixel count of the border? The bug report response says

  • by SEE ( 7681 ) on Friday May 20, 2011 @06:56PM (#36197552) Homepage

    They're elderly, so all we have to do is wait them out.

  • by Rob the Bold ( 788862 ) on Friday May 20, 2011 @08:10PM (#36198144)

    Before commenting, please read "The Design of Everyday Things" by Don Norman.

    Please.

  • by Dracos ( 107777 ) on Friday May 20, 2011 @09:09PM (#36198456)

    The problem for anyone who finds it "difficult" to use a piece of technology has nothing to do with the interface, but rather with their fear of the technology itself, or fear of "messing something up" or "not doing it right". It's a confidence issue, not a comprehension one.

    If very small children can pick up a Nintendo DS or a LeapFrog device and use it with little instruction, then it stands to reason that, all things being equal, the elderly should be able to use a cell phone just as easily, if for no other reason than they learned how to read decades ago. Blaming the UI is absurd.
