Huawei has quietly introduced a brand new feature that few people even noticed: it has abolished the navigation bar by integrating its three main functions into the fingerprint sensor. This is welcome news, since the navigation bar has barely been modified in the past decade.
This change, I think, influences how you use Android quite considerably. To see why, let's recap how people have grappled with Android thus far.
Buttons.
Google's first version of Android was basically a BlackBerry clone, and it had been under development for three years when, in January 2007, Apple unveiled the iPhone. The Android team made a dramatic pivot to emulate Apple's new UI standard, a little too closely for Steve Jobs' liking.
Android's UI evolved rapidly from "mainly QWERTY with some touchscreen" to "mostly touchscreen" to "full touch", but retained a few BlackBerry quirks. BlackBerrys had a back key and a menu key, and so did the first full-touch Androids.
These buttons persisted into second-generation Android 2.0 devices such as the HTC Hero, which had six hardware buttons (home, menu, back, search, and two call keys: the old "send" and "receive").
Given the large pent-up demand for a modern phone experience (Apple was rationing its iPhone through territorial carrier exclusives, and neither Nokia nor BlackBerry could manufacture anything competitive), Android became a megahit. Gradually, the buttons began to fade.
Designers honed the keys down to three or four: invariably some permutation of home, menu, back, and search. Samsung's runaway success with the Galaxy S and S2 (menu – home – back – search) proved you didn't need quite so many.
And since then there have been only two changes. Android 4.0, which arrived in late 2011, permitted designers to dispense with dedicated off-screen buttons and replace them with an on-screen navigation bar, saving money and shrinking the bezel. And the menu key became largely superfluous, replaced by a task switcher.
That brings us to the generic slab we have today. Only Samsung has persisted with a true hardware button. But the shift to on-screen navigation has had some downsides: it's rather more fiddly for the user, and it consumes pixels that could otherwise be used by an application.
Huawei's plan is to add swipe gestures to a front-mounted fingerprint sensor. The on-screen navigation bar is still there if you don't want to use the sensor, and you can restore it, but it's turned off by default.
Sensor overload
Huawei had already experimented with loading gestures onto the fingerprint sensor in the Honor 7, a breakthrough device for the company here.
A down swipe would pull down the notifications shade, one of the most common actions on a modern Android phone.
The P10's fingerprint sensor understands three gestures in general use. A long press returns you home. A tap takes you back. And a swipe brings up the task switcher (note for pedants: per Google it's currently officially called the "Overview" button, though most phone OEMs ignore this).
There's no way to customise this in current software builds: you get what you're given.
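Conceptually, the sensor's behaviour boils down to a small gesture-to-action dispatch table. The sketch below is purely illustrative, not Huawei's actual firmware; all the names are hypothetical, and it simply models the three gestures described above, including the toggle behaviour of the task switcher:

```python
# Illustrative model of the P10's sensor gestures.
# Not Huawei's code; names and structure are hypothetical.

NAV_ACTIONS = {
    "tap": "back",             # a tap takes you back
    "long_press": "home",      # a long press returns you home
    "swipe": "task_switcher",  # a swipe opens the Overview/task switcher
}

def dispatch(gesture, overview_open=False):
    """Return the navigation action for a sensor gesture.

    The task switcher acts as a toggle: swiping while it is
    already open dismisses it rather than opening it again.
    """
    action = NAV_ACTIONS.get(gesture)
    if action == "task_switcher" and overview_open:
        return "dismiss_task_switcher"
    return action

print(dispatch("tap"))                        # back
print(dispatch("long_press"))                 # home
print(dispatch("swipe"))                      # task_switcher
print(dispatch("swipe", overview_open=True))  # dismiss_task_switcher
```

A dictionary lookup plus one state flag is all the logic a user ever sees, which is part of why the scheme is so quick to learn.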
Coming straight from another Huawei phone, the Mate 9, I instantly missed its rear-mounted sensor, just as Tim Anderson, who wrote our Hands On with the P10, did. Putting the fingerprint sensor on the rear isn't popular with everybody, and it makes using a flip case a little awkward. However, I had very quickly become accustomed to picking the phone up, unlocking it, and using the down swipe to check new notifications.
I suspect most people will discover the feature on the P10 accidentally: they'll tap or swipe the device and find something unexpected has happened. That's what happened to me.
Programming my muscle memory to use the sensor for navigation took a little more than a day. Very cautious at first, you find a tap sends you back, and it's so much faster than shifting focus to find the back key and carefully pressing it. Your brain is doing less. It's much less disruptive.
Using a swipe to task-switch takes a little longer than tap-to-go-back, as it's a kind of toggle: swipe the sensor once to bring up the task switcher, and once more to dismiss it.
There are some downsides to the new approach, just as there are downsides to any design decision.
On previous Huawei phones, swiping the on-screen navigation bar invoked a "reachability" function, designed to make the top of the device accessible in one-handed use. With no on-screen navigation bar, this is now invoked with a diagonal swipe from one of the bottom corners of the screen, which hardly ever works. And the notifications shade remains as hard to reach one-handed as on other phones. Could Huawei have overloaded the sensor with more up/down features? Perhaps they tried and users got confused. Or perhaps the sensor got confused.
Android dominates mobile, shipping on over 80 per cent of new devices, but little changes from year to year. It's nice to see manufacturers trying something a little different.
Nokia's abandoned, all-gesture Harmattan UI on the N9 dispensed with navigation buttons completely.
This user regrets that a ground-up, all-swipes approach to design, as expressed in the Nokia N9 and BlackBerry's BB10, never caught on.
Neither needed any on-screen buttons at all. Android evolved so rapidly between 2009 and 2011 that there was no chance for a rethink, and "unused" gestures are quickly adopted by application developers.