Touchscreens Are Out, and Tactile Controls Are Back
(spectrum.ieee.org)
The answer is tactile buttons with displays behind them. Not sure why nobody is doing this in cars...
Because they are expensive. More importantly, how often does the function of a button actually change? The top-right button on Android is usually a back button (arrow or X) or a profile icon. How often does the bottom navigation in an app change? A car dashboard is an app that rarely changes.
I will do you one better: the screen in the button goes out. If the button changes its display based on context, what does the button do now? Is the software supposed to recognize that it can't display an action and do something about it? What, exactly? Or is the user supposed to remember what the button does in each context? This article is about the return to physical buttons because they are reliable. Do you see any unlabeled button on your car's dashboard? Do you remember ever having to look up in a manual what some weirdly iconed button does, on any piece of hardware?
And that is from the user's perspective alone.
Now let's do the manufacturer. Screen buttons have SKUs. Dashboards have SKUs. Screen buttons have versioned drivers. Screen buttons need power delivery, data lanes on the PCBs, and fuck knows what else.
Now imagine a physical button. It costs cents. It closes one circuit. Maybe it needs power for an LED.
Who the fuck wants screen buttons?
Finally: what the fuck do multiple screen buttons solve that a single screen, which can show any number of any buttons, couldn't?
Because they sure as fuck won't solve for context, clarity, and reliability.
Because touch screens are cheap and put the onus of design onto the programmers of apps.
I think we'll see multipurpose function buttons under the display that change function programmatically depending on what the app is doing.
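Roughly what I mean, as a rough sketch in Python (the contexts, labels, and handler names here are made up, not taken from any real head unit):

    # Hypothetical sketch: a fixed row of physical buttons under the display,
    # where each app context maps the same buttons to different labels/actions.
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class SoftButton:
        label: str                      # drawn on the display above the button
        action: Callable[[], None]      # handler for this context

    def handler(name: str) -> Callable[[], None]:
        # Placeholder handler; a real system would call into the active app.
        return lambda: print(f"action: {name}")

    # One row of four physical buttons; the mapping changes per app context.
    BUTTON_MAPS: Dict[str, List[SoftButton]] = {
        "media":      [SoftButton("Prev",  handler("prev_track")),
                       SoftButton("Play",  handler("play_pause")),
                       SoftButton("Next",  handler("next_track")),
                       SoftButton("Src",   handler("cycle_source"))],
        "navigation": [SoftButton("Zoom-", handler("zoom_out")),
                       SoftButton("Zoom+", handler("zoom_in")),
                       SoftButton("Mute",  handler("mute_guidance")),
                       SoftButton("Home",  handler("route_home"))],
    }

    def on_button_press(context: str, index: int) -> None:
        """Dispatch a physical button press using the current context's map."""
        BUTTON_MAPS[context][index].action()

    # Example: the same physical button is Play in media, Zoom+ in navigation.
    on_button_press("media", 1)
    on_button_press("navigation", 1)

The point is that the physical buttons stay put and stay reliable; only the labels drawn above them and the handlers behind them change with the active app.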
That's gonna mess with muscle memory.
Not really. You'll build muscle memory of the button sequence, if needed.
Yes, in another comment I explained that many years ago I wrote a software package to map program functions to the F keys, which on my keyboard were arranged in 2 columns of 5 on the left. It was before putting them in a single row across the top became the standard. The software displayed a diagram showing the key functions, laid out in the same pattern as the physical keys. I found it very easy to get the hang of looking at this diagram and pressing the right button without looking at the keys. Each keypress brought up new options, basically a multilevel submenu system, but using the buttons was faster than moving a mouse around and clicking.
Of course the concept is obsolete for normal computer keyboards, because that F-key layout isn't around anymore. But if the device had the buttons right under the screen, the key functions could be displayed above them. I could see that "soft buttons" concept becoming popular.
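Something like this, as a minimal sketch (the menu entries are placeholders, not my original package's layout):

    # Sketch of the multilevel soft-key idea: each key press either descends
    # into a submenu or runs an action, and the on-screen labels are redrawn
    # from whatever level you are currently on.
    from typing import Callable, Dict, Union

    MenuNode = Dict[str, Union["MenuNode", Callable[[], None]]]

    MENU: MenuNode = {
        "F1 File": {
            "F1 Open": lambda: print("open file"),
            "F2 Save": lambda: print("save file"),
        },
        "F2 Edit": {
            "F1 Copy": lambda: print("copy"),
            "F2 Paste": lambda: print("paste"),
        },
    }

    def show(level: MenuNode) -> None:
        """Draw the current key labels, mirroring the physical key layout."""
        for label in level:
            print(label)

    def press(level: MenuNode, label: str) -> MenuNode:
        """Handle one key press: run the action or descend into the submenu."""
        entry = level[label]
        if callable(entry):
            entry()
            return MENU            # back to the top level after an action
        show(entry)
        return entry

    # Example: F1 opens the File submenu, then F2 saves.
    show(MENU)
    level = press(MENU, "F1 File")
    press(level, "F2 Save")

Each press either runs an action or drops down a level and redraws the labels, which is essentially all the original software did, just faster than reaching for the mouse.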
Like a Stream Deck, essentially?