Google's Android N OS Will Support Pressure-Sensitive Screens (theverge.com)

An anonymous reader writes: In the latest Developer Preview 2 of Android N, Google introduced new "Launcher shortcuts" to the beta OS. It allows developers to "define shortcuts which users can expose in the launcher to help them perform actions quicker." It's reminiscent of Apple's "3D Touch" feature found in the iPhone 6s and iPhone 6s Plus, which lets specific parts of an app appear in a pop-up menu when users press firmly on an icon or other element built to support the feature.

As mentioned in Phandroid's report on the "setDynamicShortcuts(List)" API, Google offered four scenarios where Launcher Shortcuts make sense: navigating users to a particular location in a mapping app, sending messages to a friend in a communication app, playing the next episode of a TV show in a media app, or loading the last save point in a gaming app.

"Google says that the manufacturers who build Android devices wanted this use case addressed by the OS itself," according to The Verge, so that developers "can code for all Android devices instead of reinventing the pressure-sensitive wheel for each OEM."
  • by PsychoSlashDot ( 207849 ) on Sunday April 17, 2016 @04:12PM (#51927893)
    "Google offered four different scenarios where Launcher Shortcuts make sense: Navigating users to a particular location in a mapping app, sending messages to a friend in a communication app, playing the next episode of a TV show in a media app, or loading the last save point in a gaming app."

    How about a button that you can see, which does this stuff when you click it? None of these use cases justifies variable pressure sensitivity. Basically, drawing applications do, and that's about it. If this were actually somehow beneficial, we would have seen pressure-sensitive mouse buttons standardized two decades ago.
    • Mobile interfaces are severely constrained in many ways that desktop applications are not: small amount of real-estate, a relatively imprecise stylus (your finger), no keyboard shortcuts, no context-sensitive information on hover, and so on.

      As such, I think it's worthwhile to at least experiment with another dimension of interaction, given all these inherent limitations of the platform UI. The trick with these sorts of "interfaceless interfaces," as you put it, is to ensure that there's always a visually defined…

    • by Hadlock ( 143607 )

      Ultra-sensitive force sensors at low prices didn't exist five years ago; that's the primary point I'm going to use to defeat your logic here. You could buy them for $50-90 each, but that's not practical to include in a $200 device that's already being sold on razor-thin margins. Between high-end bicycle cranks and Apple's new trackpads, the market is opening up and prices are dropping dramatically.
       
      I do agree, though: there's no killer app for this yet.

    • by Kartu ( 1490911 )

      Phone screens have gotten bigger, but are still small.
      Cramming a small screen full of buttons (or forcing users to select first and then tap a button, which is even worse) just so someone doesn't have to learn a thing or two... how about "no, thanks"?

      What you see is an object that you interact with. E.g. a contact in a list of contacts.

      Then you interact with it.
      Swipe left - message. Swipe right - call. Press and hold - for menu. Not hard to learn, very comfortable to use.
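
      A rough sketch of this swipe scheme using the support library's real ItemTouchHelper; Contact, ContactAdapter, contactAt(), startMessage(), and startCall() are hypothetical helpers standing in for a real contacts app.

          import android.support.v7.widget.RecyclerView;
          import android.support.v7.widget.helper.ItemTouchHelper;

          // Inside the contact list screen's setup code.
          void installSwipeActions(RecyclerView contactList, final ContactAdapter adapter) {
              ItemTouchHelper.SimpleCallback swipes = new ItemTouchHelper.SimpleCallback(
                      0, ItemTouchHelper.LEFT | ItemTouchHelper.RIGHT) {
                  @Override
                  public boolean onMove(RecyclerView rv, RecyclerView.ViewHolder vh,
                                        RecyclerView.ViewHolder target) {
                      return false; // drag-and-drop disabled; swipes only
                  }

                  @Override
                  public void onSwiped(RecyclerView.ViewHolder vh, int direction) {
                      Contact c = adapter.contactAt(vh.getAdapterPosition());
                      if (direction == ItemTouchHelper.LEFT) {
                          startMessage(c); // swipe left: message
                      } else {
                          startCall(c);    // swipe right: call
                      }
                      // Redraw the row so it snaps back after the action fires.
                      adapter.notifyItemChanged(vh.getAdapterPosition());
                  }
              };
              new ItemTouchHelper(swipes).attachToRecyclerView(contactList);
              // Press-and-hold for a menu would hang off an OnLongClickListener instead.
          }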

      • Swipe left - message. Swipe right - call. Press and hold - for menu. Not hard to learn, very comfortable to use.

        How do you learn it? Discoverability is a core concept in UI design. It's one of the reasons that the menu bar, in spite of its other failings, is still a pretty good UI model: everything that you can do is visible in the menu bar and a user can explore the functionality of a new application by browsing the menu. Gestures tend to be terrible for discoverability. Do I need to go and read some documentation to discover that I can swipe those things (or read a pop-up hint that either pops up once and then…

        • by tlhIngan ( 30335 )

          How do you learn it? Discoverability is a core concept in UI design. It's one of the reasons that the menu bar, in spite of its other failings, is still a pretty good UI model: everything that you can do is visible in the menu bar and a user can explore the functionality of a new application by browsing the menu. Gestures tend to be terrible for discoverability. Do I need to go and read some documentation to discover that I can swipe those things (or read a pop-up hint that either pops up once and then…

    • I think it would be interesting, in browsers for example, to be able to select text and, with a little more pressure, have a context menu or a dictionary (like in Kindle) pop up, but only select and show "copy and paste" otherwise.
      • I think it would be interesting, in browsers for example, to be able to select text and, with a little more pressure, have a context menu or a dictionary (like in Kindle) pop up, but only select and show "copy and paste" otherwise.

        It's hard enough to master the coordination to select reasonably sized text on touch screens. No need to make it more complicated by having a simple alteration in pressure cause a different outcome. Try this: selecting text works like it already does, and there's a menu/ribbon/UI element that appears when you have something selected, and that lets you DO THINGS, such as Cut/Copy/FindDefinition. Wow.

        The more likely a UI is to cause a new user to ask "what did I just do", the worse it is.
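
        That "menu appears when something is selected" behavior already has a hook in Android: a sketch using TextView's real selection action mode, where the "Define" item and lookupDefinition() are hypothetical additions.

            import android.view.ActionMode;
            import android.view.Menu;
            import android.view.MenuItem;
            import android.widget.TextView;

            // Inside the screen that owns the TextView; MENU_DEFINE is an arbitrary id.
            private static final int MENU_DEFINE = 1;

            void installSelectionMenu(final TextView textView) {
                textView.setCustomSelectionActionModeCallback(new ActionMode.Callback() {
                    @Override
                    public boolean onCreateActionMode(ActionMode mode, Menu menu) {
                        menu.add(0, MENU_DEFINE, 0, "Define"); // alongside Cut/Copy/Paste
                        return true;
                    }

                    @Override
                    public boolean onPrepareActionMode(ActionMode mode, Menu menu) {
                        return false; // no changes after creation
                    }

                    @Override
                    public boolean onActionItemClicked(ActionMode mode, MenuItem item) {
                        if (item.getItemId() == MENU_DEFINE) {
                            CharSequence sel = textView.getText().subSequence(
                                    textView.getSelectionStart(), textView.getSelectionEnd());
                            lookupDefinition(sel); // hypothetical dictionary lookup
                            mode.finish();
                            return true;
                        }
                        return false;
                    }

                    @Override
                    public void onDestroyActionMode(ActionMode mode) { }
                });
            }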

  • by Anonymous Coward

    You can't tell what the hard press does until you stumble across it. For our iOS app, we have great usage metrics for every action, and only a tiny portion of users use the 3D taps that we spent a ton of money and time adding to the app. Yes, they'll become more useful as designers get better at creating useful features and as users become more acquainted with them, but they just aren't intuitive like dragging one finger to scroll or two fingers to zoom.

    • Which is why old people love 3D Touch. After decades of using bad command-line interfaces, they want to bring the same unintuitive interfaces to mobile.

      • by GuB-42 ( 2483988 )

        Command lines are by nature more intuitive than GUIs. They require you to learn commands (or have a reference handy), and to type, and they are not flashy, but you can't beat the intuitiveness of "giving orders to the computer".
        3D Touch is the opposite; it adds a whole new axis: not only do you need to know where to press and for how long, but also how hard.

        • Command lines are by nature more intuitive than GUIs. They require you to learn commands

          I don't think the word 'intuitive' means what you think it means.

          • by narcc ( 412956 )

            You missed this part:

            you can't beat the intuitiveness of "giving orders to the computer".

            Today's "virtual assistant" programs are essentially command lines, just with a more "natural language" style interface. We've seen the same approach in older command line interfaces like Lotus HAL. No matter how advanced the GUI, you're not going to beat setting alarms and appointments with a quick voice command. It's as intuitive as it gets.

            • In some situations, sure, but for basic, common GUI tasks like selecting a passage of text in a document, it's much faster and more intuitive (and efficient) to click and drag your mouse than to use voice or text commands to explain to the computer which passage you want to highlight.
              • by narcc ( 412956 )

                Very true. There is no panacea.

                • Yeah, it all depends on the task you're doing. I'd say that in the vast majority of cases you can pick one interaction method that is the clear winner in terms of being most intuitive, but no single method wins for everything.
        • I know the instinct to jump to the CLI's defense is strong, but you should really pick your qualifiers more carefully. CLIs have many advantages, such as power, flexibility, extensibility, etc., but "intuitive" is certainly not one of those qualities. Requiring a reference or having to memorize commands before you can do anything pretty much rules out "intuitive" by definition.

          I wouldn't claim 3D Touch is intuitive either, incidentally. It's pretty much something that needs to be explicitly taught, but it's a…

        • ... more intuitive ... require you to learn ...

          Pick one.

        • Command lines are by nature more intuitive than GUIs. They require you to learn

          Err...

  • Capacitive touchscreens work great ... until they get wet. With the recent push by Samsung into the water-resistant phone/tablet market, I imagine we'll be seeing an Android device that works entirely underwater within 12 months. Imagine taking your phone into the surf or the pool. It's coming!
    • The Galaxy S7 Edge appears to already work under water [youtube.com]
      • I'd be curious to know what kind of water that guy used. Distilled water is actually not very conductive; salt water is. The phone might not work as well in something other than fresh tap water.

        The fact that the Galaxy S7 Edge worked at all underwater is impressive! The guy was able to swipe the screen with his finger, but I don't think it worked every time. I'd be interested in seeing a single-press test, and seeing whether the phone can accurately locate the finger (XY) on the screen. Swiping is one thing, but…

  • But Google, I beg you: keep rudimentary functions like opening a menu identical.

    In the last 4 years I have seen more pointless UI changes on mobile devices than I'd like.

    Thanks.

  • Capacitive input devices such as touchpad's and touchscreens have always been touch pressure sensitive. It's a very simple principle. The harder you press, the larger the contact area that your finger makes with the screen / touchpad. My PS2 / Serial interface'd ALPS Glidepoint from almost 20 years ago could do this.
    I wonder why it's taken so long for someone to realise this could be useful as yet another input vector. I should add that the Glidepoint never used this for input; it just showed a bigger circle…
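
    A sketch of reading contact area as a stand-in for pressure, as this comment describes. MotionEvent.getSize() and getPressure() are real Android APIs; the 0.8f cutoff and onHardPress() are assumptions.

        import android.view.MotionEvent;

        // Inside a custom View subclass.
        @Override
        public boolean onTouchEvent(MotionEvent event) {
            float area = event.getSize();         // normalized contact area, 0..1
            float pressure = event.getPressure(); // on capacitive panels, often derived from area
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN && pressure > 0.8f) {
                onHardPress(event.getX(), event.getY()); // hypothetical hard-press handler
                return true;
            }
            return super.onTouchEvent(event);
        }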

    • Repeatability. That is all.

      Capacitive touchscreens are variable with pressure, but they are also variable with body chemistry, humidity, sweat, whether a person has just washed their hands, etc.

      That would make for a very difficult interface if it changed every time I picked up my device.
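
      One hedged way around that repeatability problem is to compare each touch against a running per-session baseline rather than a fixed threshold; the smoothing factor and 1.5x ratio below are arbitrary assumptions, not anything Android ships.

          public class RelativePressureDetector {
              private static final float ALPHA = 0.1f;      // baseline smoothing factor
              private static final float HARD_RATIO = 1.5f; // "hard" = 50% above baseline
              private float baseline = -1f;

              // Returns true when this sample reads as a hard press for this session.
              public boolean isHardPress(float pressure) {
                  if (baseline < 0f) {
                      baseline = pressure; // first touch seeds the baseline
                      return false;
                  }
                  boolean hard = pressure > baseline * HARD_RATIO;
                  if (!hard) {
                      // Only ordinary touches update the baseline, so hard presses
                      // don't drag the reference point upward.
                      baseline += ALPHA * (pressure - baseline);
                  }
                  return hard;
              }
          }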

    • touchpad's and touchscreens

      Is the touchscreen jealous because it doesn't have anything but the touchpad does?

  • So Google is deciding to fix its past mistake of not implementing multi-window, and now it's actually looking forward and supporting features that aren't in any existing phones. Great!

    Maybe we'll see decent keyboard/mouse support next, given that there are already netbooks out there running Android.
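
    For reference, the N previews expose multi-window state to apps directly; a sketch, where the layout-switching helpers are hypothetical.

        // Inside an Activity; onMultiWindowModeChanged() and isInMultiWindowMode()
        // are real Android N APIs.
        @Override
        public void onMultiWindowModeChanged(boolean inMultiWindow) {
            super.onMultiWindowModeChanged(inMultiWindow);
            if (inMultiWindow) {
                useCompactLayout(); // hypothetical: trim chrome for a half-screen pane
            } else {
                useFullLayout();    // hypothetical
            }
        }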

    • by Anonymous Coward

      God, so much this.

      They bitched their asses off at Samsung for making an actually fairly decent windowing system for Android, then they went and made a half-assed piece of crap in the recent betas that doesn't even come close to how Samsung's works.
      Hell, a "floating window" app is better than Google's windowing system.

      On my tablet right now (the first Note 10), I have Chrome, S Note and Sketchbook open in the back.
      I have them basically Always On Top above the sketchbook when working between the 3.
      I also have a…

      • by dargaud ( 518470 )

        On my tablet right now (the first Note 10), I have Chrome, S Note and Sketchbook open in the back. I have them basically Always On Top above the sketchbook when working between the 3.

        I envy you. On my Android phone, whenever I switch apps, it closes the former one. If I switch back, I have to reopen it and go back to whatever I was doing. And it can take up to 15 seconds just to switch apps. Fucking useless, and with a phone that was top of the line last year, too. I don't understand how they design this garbage.

  • None. There are none.

    Assuming that this is the same as 3D Touch (which it looks to be), that is... well, at best, 3D Touch is a useless feature that doesn't work reliably and that you can ignore. At worst, it's yet another overloaded function on an interface that gets in the way and causes problems.

    It's a pity that phone/OS manufacturers are too busy focusing on silly gimmicks instead of rock-solid reliability, responsive UIs that don't require a ton of CPU/GPU power, and longer battery life.

  • I don't mind the idea as long as it can be globally disabled (see the sketch after this comment). I don't like the idea that if I'm particularly heavy-handed one day, I'll tap an icon expecting one action and find it does something different. Or I jab at it when I'm in a mood and, again, get a different action.

    And how does this help with discoverability? That seems to be going out of the window in modern UIs. Want to delete something? Er, is it a swipe? A long tap? Could it now be a harder tap? Who knows; there's nothing in the UI to help.
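
    A sketch of that opt-in gate, assuming a SharedPreferences flag; the key name, isHardPress(), and the two handlers are made up.

        import android.content.Context;
        import android.content.SharedPreferences;
        import android.preference.PreferenceManager;
        import android.view.MotionEvent;

        // Hard presses only do something if the user has turned them on.
        void dispatchTouch(Context context, MotionEvent event) {
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context);
            boolean enabled = prefs.getBoolean("pref_enable_hard_press", false); // off by default
            if (enabled && isHardPress(event)) {
                onHardPress(event);  // pressure shortcut, opt-in only
            } else {
                onPlainTap(event);   // heavy-handed taps fall back to the normal action
            }
        }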
