Research

Is it me or have smartphones become interesting again?

The feeling had been hanging around for some time: anything from “a couple of years” to “since the Nexus 5,” depending on who you asked. Whatever the timescales, many Android fans had noticed smartphones seemed to have lost their spark.

As I was looking over the recent Android smartphone releases last week, something occurred to me: it seems we’re coming out the other side of this drought.

Smartphones are exciting again, man — I’ll explain why.

Midrange is basically premium now

Since the dawn of the camera phone, the best cameras have been held hostage by high price tags. As of 2019, even $300-$400 phones have cameras that can compete with handsets three times the price, and that’s exciting for all concerned.

Look no further than the Google Pixel 3a. This phone costs $399, but it features essentially the same main camera as the Pixel 3, Google’s current $799 flagship.

The Google Pixel 3a

Not only is the hardware the same; major software features like Top Shot, Photobooth Mode, and the ever-popular Night Sight are also included.

The thing is, general users aren’t bothered about how fast a phone can open ten apps. They don’t care about the latest chipsets, or in-display fingerprint scanners, or bezel-less screens, or other premium characteristics. However, if they wanted a top-of-the-range camera experience, they previously had to pay for all of those other bells and whistles anyway.

With the Google Pixel 3a and 3a XL, this isn’t the case. These are no-frills, hassle-free phones that take incredible snaps — and probably the best example of this kind of device I’ve ever seen. Better still, they even come with a 3.5mm headphone jack, and are set for fast updates. What a package!

The Pixel 3a isn’t just a 2019 midrange one-off, either. Samsung rolled out a much-needed revamp of its mid-tier this year. Galaxy A phones have always sold well, but largely off the back of the Samsung name rather than their inherent quality — readers of Android review sites have known for years that there are better options.

However, the most recent Galaxy A phones have featured some genuinely high-quality hardware. Just look at how cool the Galaxy A70 below is. It looks like a flagship in its own right.

The Samsung Galaxy A70 has an awesome design.

Heck, look at how cool the Galaxy A80 is:

The Galaxy A80’s rotating camera shown off in a rotating video. (Credit: Yugatech)

Galaxy A phones don’t just have the premium look, though; a few of them arrived with features even Samsung’s flagships didn’t yet have. The Galaxy A80 above had Samsung’s first sliding, flipping camera, and the Samsung Galaxy A9 2018 was the first quad-camera smartphone, well, ever.

Critically, these are not flagship phones with all the good bits stripped out to make them cheaper. Picking up a midrange phone doesn’t just have to be about saving money anymore; it can be about getting the features you want. The standard Google Pixel 3 and Pixel 3 XL don’t have a headphone port, for example — an essential feature for some people, and one they can get on the similar, less expensive Pixel 3a. The premium Galaxy S10 Plus is capped at three static rear cameras, unlike the Galaxy A80.

The Redmi K20 Pro phones. (Credit: Xiaomi)

This trend of remarkable midrangers isn’t set to stop anytime soon. In the coming weeks we’ll have the Snapdragon 855-toting Redmi K20 Pro, a new line (the K series) from a fairly new sub-brand (Redmi) that looks set to challenge even the best of the flagship competition.

What a time to be an Android fan. There have always been low-cost smartphone options that offer high-end features, but I don’t think they’ve ever been as impressive as they are right now.

Second-tier OEM, first-rate flagship

Another sign of Android’s recent revitalization arrives via some of the less popular OEMs.

When a person thinks of phone brands, ZTE is unlikely to top their list. Yet the flagship it released last month, the Axon 10 Pro, is a standout phone.

The Axon 10 Pro is a particularly interesting device because, on the surface, it is just a premium phone. Its specs include a Snapdragon 855 chip, up to 12GB RAM, up to 256GB storage, triple rear cameras, a 4,000mAh battery, and a 92 percent screen-to-body ratio. It just happens to also start at a lower price than most premium Android phones — 599 euros (~$676). For comparison, the Galaxy S10 starts at 899 euros (~$1,015) and the Huawei P30 Pro starts at 999 euros (~$1,128).

The ZTE Axon 10 Pro

Then there’s the Asus Zenfone 6. It’s powerful, it has an awesome, cassette player-esque flip-up camera, and a bunch of premium specs. “If you want 90 percent of a flagship for 50 percent of the price, this is the phone to beat,” we said in our review. The Zenfone 6 also doesn’t look like any other smartphone, but in a purposeful way, rather than just for the sake of standing out.

The Asus Zenfone 6

The Realme X is probably the best Realme phone yet — it looks glorious and is going on sale in China for the equivalent of around $220 (we should have our full review of that ready in the next week or so).

Then you have a brand like OnePlus. It’s still not quite a household name in the West, and far from the top of the Android platform in terms of sales, but it nonetheless delivers absolute standout phones like the OnePlus 7 Pro, which just about hit all the lofty expectations of its enthusiast audience.

The OnePlus 7 Pro

These kinds of diverse flagships are contributing to an overall sense that Android is becoming lively again. In 2019, it arguably wouldn’t have mattered if Samsung’s latest Galaxy S10 had been a letdown, or if the Xiaomi Mi 9 had been ridiculously underwhelming, because quality smartphones can now be found everywhere.

Bad phones are back

One of the biggest giveaways the smartphone landscape is once again heating up, counter-intuitive as it may sound, is the return of bad phones.

For a while, Android was a bit of a grey landscape: flagships were good, mid-tier phones were good, budget phones were, again, pretty good, and they all did much the same thing. It’s only now that a few stinkers have cropped up that we can appreciate how great the phones of the recent past have been.

The Nokia 9 PureView

The Nokia 9 PureView arrived earlier this year with a unique look and five rear cameras. I believe it was a worthwhile pursuit for HMD, but as we noted in our review, the company’s best efforts fell “frustratingly short.” In the end, its unique penta-camera was its worst feature.

We’ve also seen the “extremely average” Alcatel 3, a phone you’d probably struggle to remember if you didn’t work for an Android phone reviews site (and even then). It was shiny. It was affordable. It was also a slow, plastic, Micro-USB-toting mess running Android Oreo in the Android Pie era.

The Alcatel 3

LG delivered an inconsistent camera and a useless palm reader on the boring LG G8 ThinQ — its current flagship — and let’s not forget the Red Hydrogen One, which was seriously overpriced at $1,295.

HTC, meanwhile, recently released the, erm, the, err, erm, err… never mind.

The Red Hydrogen One

Remember, this is good! There has been a shift away from handsets you’d struggle to tell apart to distinct new smartphone flavors. We’ve seen recent phones with pop-up selfie cameras, flip cameras, sliding phones, notches that look like shark fins, and the promise of folding phones (troubled as their introduction may have been). The variety of phone designs on this short list alone is something to shout about.

2019 is far from over

Most of the great phones on this list were released in 2019. There has been a noticeable increase in originality and flair in just a few months, and that’s without even looking at the biggest and best of Android, like the Galaxy S10 or the Huawei P30 Pro. Those are arguably the best smartphones the world has ever seen, and as far as I’m concerned, they’re minor talking points here.

2019 has been kind to Android phones, and the good news is we’re only halfway through. There are big things still set for the remaining quarters (Pixel 4 or Galaxy Note 10, anyone?), and I’d suggest, if the latter half is as good as the first, 2019 will be the best year Android fans have ever had.

Source:

Research

Part human, part machine: is Apple turning us all into cyborgs?

At the beginning of the Covid-19 pandemic, Apple engineers embarked on a rare collaboration with Google. The goal was to build a system that could track individual interactions across an entire population, in an effort to get a head start on isolating potentially infectious carriers of a disease that, as the world was discovering, could be spread by asymptomatic patients.

Delivered at breakneck pace, the resulting exposure notification tool has yet to prove its worth. The NHS Covid-19 app uses it, as do others around the world. But lockdowns make interactions rare, limiting the tool’s usefulness, while in a country with uncontrolled spread, it isn’t powerful enough to keep the R number low. In the Goldilocks zone, when conditions are just right, it could save lives.
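
The clever part of the Apple-Google design is that it tracks encounters without tracking people: phones broadcast anonymous, frequently rotating identifiers over Bluetooth, and the matching happens on the device rather than on a server. The sketch below illustrates that decentralised matching idea in simplified form; it is not the real Exposure Notification key schedule (the actual protocol derives rolling identifiers with HKDF and AES and pairs them with Bluetooth metadata), and all the function names here are mine.

```python
# Simplified sketch of a decentralised exposure-notification scheme of the kind
# Apple and Google built. This is NOT the real GAEN key schedule (which derives
# rolling identifiers with HKDF and AES); it only illustrates the matching idea.
import hashlib
import secrets

def daily_key() -> bytes:
    """Each phone generates a random key per day; it stays on the device unless
    the owner later chooses to report a positive test."""
    return secrets.token_bytes(16)

def rolling_ids(key: bytes, intervals_per_day: int = 144) -> list:
    """Derive short-lived identifiers from the daily key, one per ~10-minute
    interval; these are what the phone broadcasts over Bluetooth."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals_per_day)]

# Phone A broadcasts its rolling IDs; phone B logs the few it hears nearby.
key_a = daily_key()
heard_by_b = set(rolling_ids(key_a)[40:45])   # a brief encounter

# If A later tests positive, A uploads its daily keys. B re-derives the IDs
# locally and checks for overlap; no central server learns who met whom.
matches = heard_by_b & set(rolling_ids(key_a))
print(f"possible exposure intervals: {len(matches)}")
```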

The NHS Covid-19 app has had its teething problems. It has come under fire for not working on older phones, and for its effect on battery life. But there’s one criticism that has failed to materialise: what happens if you leave home without your phone? Because who does that? The basic assumption that we can track the movement of people by tracking their phones is an accepted fact.

This year has been good for tech companies, and Apple is no exception. The wave of global lockdowns has left us more reliant than ever on our devices. Despite being one of the first large companies to be seriously affected by Covid, as factory shutdowns in China hit its supply chain, delaying the launch of the iPhone 12 by a month, Apple’s revenue has continued to break records. It remains the largest publicly traded company in the world by a huge margin: this year its value has grown by 50% to $2tn (£1.5tn) and it is still $400bn larger than Microsoft, the No 2.

It’s hard to think of another product that has come close to the iPhone in sheer physical proximity to our daily lives. Our spectacles, contact lenses and implanted medical devices are among the only things more personal than our phones.

Without us even noticing, Apple has turned us into organisms living symbiotically with technology: part human, part machine. We now outsource our contact books, calendars and to-do lists to devices. We no longer need to remember basic facts about the world; we can call them up on demand. But if you think that carrying around a smartphone – or wearing an Apple Watch that tracks your vitals in real time – isn’t enough to turn you into a cyborg, you may feel differently about what the company has planned next.

A pair of smartglasses, in development for a decade, could be released as soon as 2022, and would have us quite literally seeing the world through Apple’s lens – putting a digital layer between us and the world. Already, activists are worrying about the privacy concerns sparked by a camera on everyone’s face. But deeper questions, about what our relationship should be to a technology that mediates our every interaction with the world, may not even be asked until it’s too late to do anything about the answer.


The word cyborg – short for “cybernetic organism” – was coined in 1960 by Manfred E Clynes and Nathan S Kline, whose research into spaceflight prompted them to explore how incorporating mechanical components could aid in “the task of adapting man’s body to any environment he might choose”. It was a very medicalised concept: the pair imagined embedded pumps dispensing drugs automatically.

In the 1980s, genres such as cyberpunk began to express writers’ fascination with the nascent internet, and wonder how much further it could go. “It was the best we could do at the time,” laughs Bruce Sterling, a US science fiction author and futurist whose Mirrorshades anthology defined the genre for many. Ideas about putting computer chips, machine arms or chromium teeth into animals might have been very cyberpunk, Sterling says, but they didn’t really work. Such implants, he points out, aren’t “biocompatible”. Organic tissue reacts poorly, forming scar tissue, or worse, at the interface. While science fiction pursued a Matrix-style vision of metal jacks embedded in soft flesh, reality took a different path.

“If you’re looking at cyborgs in 2020,” Sterling says, “it’s in the Apple Watch. It’s already a medical monitor, it’s got all these health apps. If you really want to mess with the inside of your body, the watch lets you monitor it much better than anything else.”

The Apple Watch had a shaky start. Despite the company trying to sell it as the second coming of the iPhone, early adopters were more interested in using their new accessory as a fitness tracker than in trying to send a text message from a device far too small to fit a keyboard. So by the second iteration of the watch, Apple changed tack, leaning into the health and fitness aspect of the tech.

Now, your watch can not only measure your heart rate, but scan the electric signals in your body for evidence of arrhythmia; it can measure your blood oxygenation level, warn you if you’re in a noisy environment that could damage your hearing, and even call 999 if you fall over and don’t get up. It can also, like many consumer devices, track your running, swimming, weightlifting or dancercise activity. And, of course, it still puts your emails on your wrist, until you turn that off.

Apple believes that it can succeed where Google Glass failed. Illustration: Steven Gregor/The Guardian

As Sterling points out, for a vast array of health services that we would once have viewed as science fiction, there’s no need for an implanted chip in our head when an expensive watch on our wrist will do just as well.

That’s not to say that the entirety of the cyberpunk vision has been left to the world of fiction. There really are people walking around with robot limbs, after all. And even there, Apple’s influence has starkly affected what that future looks like.

“Apple, I think more than any other brand, truly cares about the user experience. And they test and test and test, and iterate and iterate and iterate. And this is what we’ve taken from them,” says Samantha Payne, the chief operating officer of Bristol’s Open Bionics. The company, which she co-founded in 2014 with CEO Joel Gibbard, makes the Hero Arm, a multi-grip bionic hand. With the rapid development of 3D printer technology, Open Bionics has managed to slash the cost of such advanced prosthetics, which could have cost almost $100,000 10 years ago, to just a few thousand dollars.

Rather than focus on flesh tones and lifelike design, Open Bionics leans into the cyborg imagery. Payne quotes one user describing it as “unapologetically bionic”. “All of the other prosthetics companies give the impression that you should be trying to hide your disability, that you need to try and fit in,” she says. “We are a company that’s taking a big stance against that.”

At times, Open Bionics has been almost too successful in that goal. In November, the company launched an arm designed to look like that worn by the main character in the video game Metal Gear Solid V – red and black, shiny plastic and, yes, unapologetically bionic – and the response was unsettling. “You got loads of science fiction fans saying that they really are considering chopping off their hand,” Payne says.

Some disabled people who rely on technology to live their daily lives feel that cyberpunk imagery can exoticise the very real difficulties they face. And there are also lessons in the way that more prosaic devices can give disabled people what can only be described as superpowers. Take hearing aid users, for example: deaf iPhone owners can not only connect their hearing aids to their phones with Bluetooth, they can even set up their phone as a microphone and move it closer to the person they want to listen to, overcoming the noise of a busy restaurant or crowded lecture theatre. Bionic ears anyone?

“There’s definitely something in the idea of everyone in the world being a cyborg today,” Payne says. “A crazy high number of people in the world have a smartphone, and so all of these people are technologically augmented. It’s definitely taking it a step further when you depend on that technology to be able to perform everyday living; when it’s adorned to your body. But we are all harnessing the vast power of the internet every single day.”


Making devices so compelling that we carry them with us everywhere we go is a mixed blessing for Apple. The iPhone earns it about $150bn a year, more than every other source of revenue combined. In creating the iOS App Store, it has assumed a gatekeeper role with the power to reshape entire industries by carefully defining its terms of service. (Ever wonder why every app is asking for a subscription these days? Because of an Apple decision in 2016. Bad luck if you prefer to pay upfront for software.) But it has also opened itself up to criticism that the company allows, or even encourages, compulsive patterns of behaviour.

Apple co-founder Steve Jobs famously likened personal computers to “bicycles for the mind”, enabling people to do more work for the same amount of effort. That was true of the Macintosh computer in 1984, but modern smartphones are many times more powerful. If we now turn to them every waking hour of the day, is that because of their usefulness, or for more pernicious reasons?

“We don’t want people using their phones all the time,” Apple’s chief executive, Tim Cook, said in 2019. “We’re not motivated to do that from a business point of view, and we’re certainly not from a values point of view.” Later that year, Cook told CBS: “We made the phone to make your life better, and everybody has to decide for his or herself what that means. For me, my simple rule is if I’m looking at the device more than I’m looking into someone’s eyes, I’m doing the wrong thing.”

Apple has introduced features, such as the Screen Time setting, that help people strike that balance: users can now track, and limit, their use of individual apps, or entire categories, as they see fit. Part of the problem is that, while Apple makes the phone, it doesn’t control what people do with it. Facebook needs users to open its app daily, and Apple can only do so much to counter that tendency. If these debates – about screen time, privacy and what companies are doing with our data, our attention – seem like a niche topic of interest now, they will become crucial once Apple’s latest plans become reality. The reason is the company’s worst-kept secret in years: a pair of smartglasses.

It filed a patent in 2006 for a rudimentary version, a headset that would let users see a “peripheral light element” for an “enhanced viewing experience”, able to display notifications in the corner of your vision. That was finally granted in 2013, at the time of Google’s own attempt to convince people about smartglasses. But Google Glass failed commercially, and Apple kept quiet about its intentions in the field.

Recently, the company has intensified its focus on “augmented reality”, technology that overlays a virtual world on the real one. It’s perhaps best known through the video game Pokémon Go, which launched in 2016, superimposing Nintendo’s cute characters on parks, offices and playgrounds. However, Apple insists, it has much greater potential than simply enhancing games. Navigation apps could overlay the directions on top of the real world; shopping services could show you what you would look like wearing the clothes you’re thinking of getting; architects could walk around inside the spaces they have designed before shovels even break ground.

Smartglasses could leave us quite literally seeing the world through Apple’s lens. Illustration: Steven Gregor/The Guardian

With each new iPhone launch, Apple’s demonstrated new breakthroughs in the technology, such as “Lidar” support in new iPhones and iPads, a tech (think radar with lasers) that lets them accurately measure the physical space they are in. Then, at the end of 2019, it all slotted into place: a Bloomberg report suggested that the company hadn’t given up on smartglasses in the wake of Google Glass’s failure, but had spent five years honing the concept. The pandemic put paid to a target of getting hardware on the shelves in 2020, but the company is still hoping to make an announcement next year for a 2022 launch, Bloomberg suggested.
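
The ranging principle itself fits in a couple of lines: a Lidar times how long a light pulse takes to bounce back and halves the round trip. The snippet below shows that generic time-of-flight arithmetic; it says nothing about Apple’s particular sensor design, and the example timing is invented.

```python
# The basic ranging idea behind Lidar (generic time-of-flight, not Apple's
# specific sensor design): distance is half the round-trip time of a light
# pulse multiplied by the speed of light.
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    return C * t_seconds / 2

# A pulse that returns after ~13.3 nanoseconds came from roughly 2 metres away.
print(f"{distance_from_round_trip(13.3e-9):.2f} m")
```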

Apple’s plans cover two devices, codenamed N301 and N421. The former is designed to feature “ultra-high-resolution screens that will make it almost impossible for a user to differentiate the virtual world from the real one”, according to Bloomberg’s Mark Gurman. This is a product with an appeal far beyond the hardcore gamers who have adopted existing VR headsets: you might put it on to enjoy lifelike, immersive entertainment, or to do creative work that can make the most of the technology, but would probably take it off to have lunch, for instance.

N421 is where the real ambitions lie. Expected in 2023, it’s described only as “a lightweight pair of glasses using AR”. But, argues Mark Pesce in his book Augmented Reality, this would be the culmination of the “mirrorshades” dreamed up by the cyberpunks in the 80s, using the iPhone as the brains of the device and “keeping the displays themselves light and comfortable”. Wearing it all day, every day, the idea of a world without a digital layer between you and reality would eventually fade into memory – just as living without immediate access to the internet has for so many right now.

Apple isn’t the first to try to build such a device, says Rupantar Guha of the analysts GlobalData, who has been following the trend in smartglasses from a business standpoint for years, but it could lead the wave that makes it relevant. “The public perception of smartglasses has struggled to recover from the high-profile failure of Google Glass, but big tech still sees potential in the technology.” Guha cites the recent launch of Amazon Echo Frames – sunglasses you can talk to, because they have got the Alexa digital assistant built in – and Google’s purchase of the smartglasses maker North in June 2020. “Apple and Facebook are planning to launch consumer smartglasses over the next two years, and will expect to succeed where their predecessors could not,” Guha adds.

If Apple pulls off that launch, then the cyberpunk – and cyborg – future will have arrived. It’s not hard to imagine the concerns, as cultural questions clash with technological: should kids take off their glasses in the classroom, just as we now require them to keep phones in their lockers? Will we need to carve out lens-free time in our evenings to enjoy old-fashioned, healthy activities such as watching TV or playing video games?

“It’s a fool’s errand to imagine every use of AR before we have the hardware in our hands,” writes the developer Adrian Hon, who was called on by Google to write games for their smartglasses a decade ago. “Yet there’s one use of AR glasses that few are talking about but will be world-changing: scraping data from everything we see.” This “worldscraping” would be a big tech dream – and a privacy activist’s nightmare. A pair of smartglasses turns people into walking CCTV cameras, and the data that a canny company could gather from that is mindboggling. Every time someone browsed a supermarket, their smartglasses would be recording real-time pricing data, stock levels and browsing habits; every time they opened up a newspaper, their glasses would know which stories they read, which adverts they looked at and which celebrity beach pictures their gaze lingered on.

“We won’t be able to opt out from wearing AR glasses in 2035 any more than we can opt out of owning smartphones today,” Hon writes. “Billions have no choice but to use them for basic tasks like education, banking, communication and accessing government services. In just a few years time, AR glasses do the same, but faster and better.”

Apple would argue that, if any company is to control such a powerful technology, it ought to. The company declined to speak on the record for this story, but it has invested time and money in making the case that it can be trusted not to abuse its power. The company points to its comparatively simple business model: make things, and sell them for a lot of money. It isn’t Google or Facebook, trying to monetise personal data, or Amazon, trying to replace the high street – it’s just a company that happens to make a £1,000 phone that it can sell to 150 million people a year.

But whether we trust Apple might be beside the point, if we don’t yet know whether we can trust ourselves. It took eight years from the launch of the iPhone for screen time controls to follow. What will human interaction look like eight years after smartglasses become ubiquitous? Our cyborg present sneaked up on us as our phones became glued to our hands. Are we going to sleepwalk into our cyborg future in the same way?

Source: https://www.theguardian.com/technology/2020/nov/25/part-human-part-machine-is-apple-turning-us-all-into-cyborgs

Research

Researchers hacked a robotic vacuum cleaner to record speech and music remotely

A team of researchers demonstrated that popular robotic household vacuum cleaners can be remotely hacked to act as microphones.

The researchers—including Nirupam Roy, an assistant professor in the University of Maryland’s Department of Computer Science—collected information from the laser-based navigation system in a popular vacuum robot and applied signal processing and deep learning techniques to recover speech and identify music playing in the same room as the device.

The research demonstrates the potential for any device that uses light detection and ranging (Lidar) technology to be manipulated for collecting sound, despite not having a microphone. This work, which is a collaboration with assistant professor Jun Han at the University of Singapore, was presented at the Association for Computing Machinery’s Conference on Embedded Networked Sensor Systems (SenSys 2020) on November 18, 2020.

“We welcome these devices into our homes, and we don’t think anything about it,” said Roy, who holds a joint appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS). “But we have shown that even though these devices don’t have microphones, we can repurpose the systems they use for navigation to spy on conversations and potentially reveal private information.”

The Lidar navigation systems in household vacuum bots shine a laser beam around a room and sense the reflection of the laser as it bounces off nearby objects. The robot uses the reflected signals to map the room and avoid collisions as it moves through the house.
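
For readers who want a picture of how those reflections become a map, the sketch below converts a sweep of (angle, distance) Lidar readings into obstacle points in a 2D room frame. It is a minimal illustration of the geometry, not any vendor’s actual mapping or SLAM code, and the sample sweep is invented.

```python
# Minimal sketch of how a spinning Lidar's (angle, range) readings become a
# 2D map of the room -- an illustration of the principle, not a vendor's
# actual mapping pipeline.
import math

def scan_to_points(scan, robot_x=0.0, robot_y=0.0, robot_heading=0.0):
    """Convert (angle_rad, distance_m) pairs into x/y obstacle points in the
    room frame, given the robot's current pose."""
    points = []
    for angle, dist in scan:
        theta = robot_heading + angle
        points.append((robot_x + dist * math.cos(theta),
                       robot_y + dist * math.sin(theta)))
    return points

# One fake 360-degree sweep from a robot sitting 1.5 m from everything around it.
sweep = [(math.radians(a), 1.5) for a in range(0, 360, 10)]
obstacles = scan_to_points(sweep)
print(obstacles[:3])
```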

Privacy experts have suggested that the maps made by vacuum bots, which are often stored in the cloud, pose potential privacy breaches that could give advertisers access to information about such things as home size, which suggests income level, and other lifestyle-related information. Roy and his team wondered if the Lidar in these robots could also pose potential security risks as sound recording devices in users’ homes or businesses.

Sound waves cause objects to vibrate, and these vibrations cause slight variations in the light bouncing off an object. Laser microphones, used in espionage since the 1940s, are capable of converting those variations back into sound waves. But laser microphones rely on a targeted laser beam reflecting off very smooth surfaces, such as glass windows.
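
To make that concrete, the sketch below simulates the core signal-processing idea: a faint audio-frequency ripple riding on a slowly drifting light-intensity reading, recovered with a band-pass filter. It is a toy model of the principle described above, not a working laser microphone, and every number in it is invented.

```python
# Toy illustration of the laser-microphone principle described in the article:
# audio vibrations show up as tiny fluctuations in reflected light intensity,
# and band-pass filtering that intensity recovers an audio-band waveform.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 8000                     # samples per second from the light sensor
t = np.arange(0, 1.0, 1 / fs)

speech_like = 0.001 * np.sin(2 * np.pi * 440 * t)   # tiny vibration-induced ripple
drift = 1.0 + 0.05 * np.sin(2 * np.pi * 0.5 * t)    # slow baseline drift
noise = 0.0005 * np.random.randn(t.size)
intensity = drift + speech_like + noise             # what the sensor "sees"

# Keep only the audio band (300-3400 Hz) to strip the drift and pull out the ripple.
sos = butter(4, [300, 3400], btype="bandpass", fs=fs, output="sos")
recovered = sosfiltfilt(sos, intensity)

print("recovered signal peak:", np.abs(recovered).max())
```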

Could your vacuum be listening to you? Deep learning algorithms were able to interpret scattered sound waves, such as those captured by a robot vacuum, to identify numbers and musical sequences. Credit: Sriram Sami

A vacuum Lidar, on the other hand, scans the environment with a laser and senses the light scattered back by objects that are irregular in shape and density. The scattered signal received by the vacuum’s sensor provides only a fraction of the information needed to recover sound waves. The researchers were unsure if a vacuum bot’s Lidar system could be manipulated to function as a microphone and if the signal could be interpreted into meaningful sound signals.

First, the researchers hacked a robot vacuum to show they could control the position of the laser and send the sensed data to their laptops through Wi-Fi without interfering with the device’s navigation.

Next, they conducted experiments with two sound sources. One source was a human voice reciting numbers played over computer speakers and the other was audio from a variety of television shows played through a TV sound bar. Roy and his colleagues then captured the laser signal sensed by the vacuum’s navigation system as it bounced off a variety of objects placed near the sound source. Objects included a trash can, cardboard box, takeout container and polypropylene bag—items that might normally be found on a typical floor.

The researchers passed the signals they received through deep learning algorithms that were trained to either match human voices or to identify musical sequences from television shows. Their computer system, which they call LidarPhone, identified and matched spoken numbers with 90% accuracy. It also identified television shows from a minute’s worth of recording with more than 90% accuracy.
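
As a rough illustration of what that kind of pipeline involves (extract features from a noisy, low-fidelity signal, then classify it), here is a deliberately simplified stand-in that swaps the paper’s deep learning models for spectrogram energies and a nearest-centroid classifier over synthetic tones. It is not the authors’ LidarPhone code, and every name in it is made up for the sketch.

```python
# LidarPhone classifies noisy, low-fidelity signals with deep learning. This is
# a much simpler stand-in (spectrogram energies plus a nearest-centroid
# classifier on synthetic tones), purely to make the "extract features, then
# classify" step concrete -- not the authors' model or data.
import numpy as np
from scipy.signal import spectrogram

rng = np.random.default_rng(0)
fs = 4000

def fake_capture(tone_hz, n=fs):
    """A weak tone buried in heavy noise, standing in for a scattered Lidar signal."""
    t = np.arange(n) / fs
    return 0.05 * np.sin(2 * np.pi * tone_hz * t) + rng.normal(0, 0.05, n)

def features(x):
    _, _, sxx = spectrogram(x, fs=fs, nperseg=256)
    band = sxx.mean(axis=1)          # average energy per frequency bin
    return band / band.sum()         # normalize so overall loudness doesn't matter

classes = {"zero": 500, "one": 900, "two": 1300}   # pretend spoken digits
centroids = {label: np.mean([features(fake_capture(hz)) for _ in range(20)], axis=0)
             for label, hz in classes.items()}

def classify(x):
    f = features(x)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

tests = [(label, classify(fake_capture(hz)))
         for label, hz in classes.items() for _ in range(10)]
accuracy = np.mean([truth == pred for truth, pred in tests])
print(f"accuracy on synthetic data: {accuracy:.0%}")
```

The real system faces far harder conditions, since light scattered off household objects carries only a sliver of the acoustic signal, which is why the authors needed trained deep networks rather than anything this simple.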

“This type of threat may be more important now than ever, when you consider that we are all ordering food over the phone and having meetings over the computer, and we are often speaking our credit card or bank information,” Roy said. “But what is even more concerning for me is that it can reveal much more personal information. This kind of information can tell you about my living style, how many hours I’m working, other things that I am doing. And what we watch on TV can reveal our political orientations. That is crucial for someone who might want to manipulate the political elections or target very specific messages to me.”

The researchers emphasize that vacuum cleaners are just one example of potential vulnerability to Lidar-based spying. Many other devices could be open to similar attacks, such as smartphone infrared sensors used for face recognition or passive infrared sensors used for motion detection.

“I believe this is significant work that will make the manufacturers aware of these possibilities and trigger the security and privacy community to come up with solutions to prevent these kinds of attacks,” Roy said.

Source: https://techxplore.com/news/2020-11-hacked-robotic-vacuum-cleaner-speech.html

Research

iPhone 12 Pro Max receives ‘highest ever’ rating from DisplayMate, sets 11 records

DisplayMate has put the iPhone 12 Pro Max Super Retina XDR display through its highly detailed testing and the outcome isn’t surprising: Apple has once again earned “DisplayMate’s highest ever Display Performance Grade of A+” and “Best Smartphone Display Award.” However, going beyond the iPhone 11 Pro’s accolades last year, the 12 Pro Max has matched or set 11 smartphone display performance records.

DisplayMate just published its deep dive review of the iPhone 12 Pro Max display. As has become a tradition, this year’s iPhone has garnered another highest ever A+ rating from the firm but more notably it has hit a milestone for how many new records it’s broken or matched: 11. For comparison, the iPhone 11 Pro matched or set 9 display performance records last year and did the same for 8 with the iPhone XS Max in 2018.

Here are the smartphone display records that DisplayMate says the iPhone 12 Pro Max has set/matched:

Note that Numerical Performance Differences that are Visually Indistinguishable are considered Matched and Tied Performance Records.

· Highest Absolute Color Accuracy (0.9 JNCD) – Visually Indistinguishable From Perfect.
· Highest Image Contrast Accuracy and Intensity Scale Accuracy (2.19 Gamma) – Visually Indistinguishable From Perfect.
· Smallest Shift in Color Accuracy and Intensity Scale with the Image Content APL (0.2 JNCD) – Visually Indistinguishable From Perfect.
· Smallest Shift in Image Contrast and Intensity Scale with the Image Content APL (0.00 Gamma) – Visually Indistinguishable From Perfect.
· Smallest Change in Peak Luminance with the Image Content Average Picture Level APL (1 percent) – Visually Indistinguishable From Perfect.
· Highest Full Screen Brightness for OLED Smartphones (825 nits for 100% APL).
· Highest Full Screen Contrast Rating in Ambient Light (172 at 100% APL) – see the quick check after this list.
· Highest Contrast Ratio (Infinite).
· Lowest Screen Reflectance (4.8 percent).
· Smallest Brightness Variation with Viewing Angle (27% at 30 degrees).
· Highest Visible Screen Resolution 2.8K (2778×1284) – 4K Does Not appear visually sharper on a Smartphone.

DisplayMate also touches on the iPhone 12 lineup featuring a 60Hz display instead of an upgraded 120Hz one like on the iPad Pro. It concludes that it “should be fine for most applications.”

The iPhone 12 Pro Max display has the standard 60 Hz Refresh Rate, rather than the higher 90 Hz and 120 Hz Refresh Rates now being introduced. With the very fast Response Time of the OLED display, and the very fast CPU and GPU processors on the iPhone 12 Pro Max, the lower 60 Hz Refresh Rate should be fine for most applications.

For a detailed look at all the ways DisplayMate tested Apple’s latest state-of-the-art iPhone display, check out the full report here.

Source: https://9to5mac.com/2020/11/17/iphone-12-pro-max-receives-highest-ever-rating-from-displaymate-sets-11-records/
