Research

It’s not just you, Google Drive is down for some today

Google Drive is an excellent online service for storing files and creating documents with Docs, Sheets, and other apps. Today, however, the service is having a bit of a hiccup, with some users reporting the site down.

First reported widely on DownDetector and since officially confirmed on the Google Status Dashboard, Drive and its apps are down or disrupted for many users today. Whatever’s going on with Google’s service, this outage doesn’t seem widespread by any means. No one on the 9to5Google team has been able to reproduce the problem, but there’s a huge spike in issues reported around the web. At the time of writing, issues started popping up widely about 25 minutes ago.

In some cases, Google Drive went down for just a few minutes before service was restored, and reports are now coming in from other regions. Whatever the problem, it appears to be something Google can fix quickly and is actively working on in the background. The majority of users experiencing issues are located in the United States.

On its official Twitter account, Google further confirmed the “service disruption” to a school district that took to Twitter complaining of issues.

Update: Some users are still experiencing issues with Google Drive, but just a few hours after the outage started, things seem to be patched up! Google has also confirmed that the issue was resolved.

The problem with Google Drive should be resolved. We apologize for the inconvenience and thank you for your patience and continued support. Please rest assured that system reliability is a top priority at Google, and we are making continuous improvements to make our systems better.

Source:
https://9to5google.com/2020/01/27/google-drive-down-today/


Research

Part human, part machine: is Apple turning us all into cyborgs?

At the beginning of the Covid-19 pandemic, Apple engineers embarked on a rare collaboration with Google. The goal was to build a system that could track individual interactions across an entire population, in an effort to get a head start on isolating potentially infectious carriers of a disease that, as the world was discovering, could be spread by asymptomatic patients.

Delivered at breakneck pace, the resulting exposure notification tool has yet to prove its worth. The NHS Covid-19 app uses it, as do others around the world. But lockdowns make interactions rare, limiting the tool’s usefulness, while in a country with uncontrolled spread, it isn’t powerful enough to keep the R number low. In the Goldilocks zone, when conditions are just right, it could save lives.
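To make the approach concrete, here is a heavily simplified sketch of the rotating-identifier idea behind the exposure notification system. The real Apple/Google protocol derives identifiers with HKDF and AES under a strict key schedule; the function names and the SHA-256 shortcut below are illustrative assumptions, not the actual API.

```python
import os, hashlib

def daily_key() -> bytes:
    """Each phone generates a fresh random key per day."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Broadcast ID, rotated every ~15 minutes so a device can't be tracked."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

def was_exposed(heard_ids: set, published_day_keys: list) -> bool:
    """Phones broadcast rolling IDs over Bluetooth and remember the IDs they
    hear. When a user tests positive, their daily keys are published; every
    phone re-derives the IDs locally and checks for a match, so no central
    record of who met whom is ever uploaded."""
    return any(
        rolling_id(key, i) in heard_ids
        for key in published_day_keys
        for i in range(96)  # 96 fifteen-minute intervals per day
    )
```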

The NHS Covid-19 app has had its teething problems. It has come under fire for not working on older phones, and for its effect on battery life. But there’s one criticism that has failed to materialise: what happens if you leave home without your phone? Because who does that? The basic assumption that we can track the movement of people by tracking their phones is an accepted fact.

This year has been good for tech companies, and Apple is no exception. The wave of global lockdowns has left us more reliant than ever on our devices. Despite being one of the first large companies to be seriously affected by Covid, as factory shutdowns in China hit its supply chain and delayed the launch of the iPhone 12 by a month, Apple’s revenue has continued to break records. It remains the largest publicly traded company in the world by a huge margin: this year its value has grown by 50% to $2tn (£1.5tn) and it is still $400bn larger than Microsoft, the No 2.

It’s hard to think of another product that has come close to the iPhone in sheer physical proximity to our daily lives. Our spectacles, contact lenses and implanted medical devices are among the only things more personal than our phones.

Without us even noticing, Apple has turned us into organisms living symbiotically with technology: part human, part machine. We now outsource our contact books, calendars and to-do lists to devices. We no longer need to remember basic facts about the world; we can call them up on demand. But if you think that carrying around a smartphone – or wearing an Apple Watch that tracks your vitals in real time – isn’t enough to turn you into a cyborg, you may feel differently about what the company has planned next.

A pair of smartglasses, in development for a decade, could be released as soon as 2022, and would have us quite literally seeing the world through Apple’s lens – putting a digital layer between us and the world. Already, activists are worrying about the privacy concerns sparked by a camera on everyone’s face. But deeper questions, about what our relationship should be to a technology that mediates our every interaction with the world, may not even be asked until it’s too late to do anything about the answer.


The word cyborg – short for “cybernetic organism” – was coined in 1960 by Manfred E Clynes and Nathan S Kline, whose research into spaceflight prompted them to explore how incorporating mechanical components could aid in “the task of adapting man’s body to any environment he might choose”. It was a very medicalised concept: the pair imagined embedded pumps dispensing drugs automatically.

In the 1980s, genres such as cyberpunk began to express writers’ fascination with the nascent internet, and to wonder how much further it could go. “It was the best we could do at the time,” laughs Bruce Sterling, a US science fiction author and futurist whose Mirrorshades anthology defined the genre for many. Ideas about putting computer chips, machine arms or chromium teeth into animals might have been very cyberpunk, Sterling says, but they didn’t really work. Such implants, he points out, aren’t “biocompatible”. Organic tissue reacts poorly, forming scar tissue, or worse, at the interface. While science fiction pursued a Matrix-style vision of metal jacks embedded in soft flesh, reality took a different path.

“If you’re looking at cyborgs in 2020,” Sterling says, “it’s in the Apple Watch. It’s already a medical monitor, it’s got all these health apps. If you really want to mess with the inside of your body, the watch lets you monitor it much better than anything else.”

The Apple Watch had a shaky start. Despite the company trying to sell it as the second coming of the iPhone, early adopters were more interested in using their new accessory as a fitness tracker than in trying to send a text message from a device far too small to fit a keyboard. So by the second iteration of the watch, Apple changed tack, leaning into the health and fitness aspect of the tech.

Now, your watch can not only measure your heart rate, but scan the electric signals in your body for evidence of arrhythmia; it can measure your blood oxygenation level, warn you if you’re in a noisy environment that could damage your hearing, and even call 999 if you fall over and don’t get up. It can also, like many consumer devices, track your running, swimming, weightlifting or dancercise activity. And, of course, it still puts your emails on your wrist, until you turn that off.

Apple believes that it can succeed where Google Glass failed. Illustration: Steven Gregor/The Guardian

As Sterling points out, for a vast array of health services that we would once have viewed as science fiction, there’s no need for an implanted chip in our head when an expensive watch on our wrist will do just as well.

That’s not to say that the entirety of the cyberpunk vision has been left to the world of fiction. There really are people walking around with robot limbs, after all. And even there, Apple’s influence has starkly affected what that future looks like.

“Apple, I think more than any other brand, truly cares about the user experience. And they test and test and test, and iterate and iterate and iterate. And this is what we’ve taken from them,” says Samantha Payne, the chief operating officer of Bristol’s Open Bionics. The company, which she co-founded in 2014 with CEO Joel Gibbard, makes the Hero Arm, a multi-grip bionic hand. With the rapid development of 3D printer technology, Open Bionics has managed to slash the cost of such advanced prosthetics, which could have cost almost $100,000 10 years ago, to just a few thousand dollars.

Rather than focus on flesh tones and lifelike design, Open Bionics leans into the cyborg imagery. Payne quotes one user describing it as “unapologetically bionic”. “All of the other prosthetics companies give the impression that you should be trying to hide your disability, that you need to try and fit in,” she says. “We are a company that’s taking a big stance against that.”

At times, Open Bionics has been almost too successful in that goal. In November, the company launched an arm designed to look like that worn by the main character in the video game Metal Gear Solid V – red and black, shiny plastic and, yes, unapologetically bionic – and the response was unsettling. “You got loads of science fiction fans saying that they really are considering chopping off their hand,” Payne says.

Some disabled people who rely on technology to live their daily lives feel that cyberpunk imagery can exoticise the very real difficulties they face. And there are also lessons in the way that more prosaic devices can give disabled people what can only be described as superpowers. Take hearing aid users, for example: deaf iPhone owners can not only connect their hearing aids to their phones with Bluetooth, they can even set up their phone as a microphone and move it closer to the person they want to listen to, overcoming the noise of a busy restaurant or crowded lecture theatre. Bionic ears anyone?

“There’s definitely something in the idea of everyone in the world being a cyborg today,” Payne says. “A crazy high number of people in the world have a smartphone, and so all of these people are technologically augmented. It’s definitely taking it a step further when you depend on that technology to be able to perform everyday living; when it’s adorned to your body. But we are all harnessing the vast power of the internet every single day.”


Making devices so compelling that we carry them with us everywhere we go is a mixed blessing for Apple. The iPhone earns it about $150bn a year, more than every other source of revenue combined. In creating the iOS App Store, it has assumed a gatekeeper role with the power to reshape entire industries by carefully defining its terms of service. (Ever wonder why every app is asking for a subscription these days? Because of an Apple decision in 2016. Bad luck if you prefer to pay upfront for software.) But it has also opened itself up to criticism that the company allows, or even encourages, compulsive patterns of behaviour.

Apple co-founder Steve Jobs famously likened personal computers to “bicycles for the mind”, enabling people to do more work for the same amount of effort. That was true of the Macintosh computer in 1984, but modern smartphones are many times more powerful. If we now turn to them every waking hour of the day, is that because of their usefulness, or for more pernicious reasons?

“We don’t want people using their phones all the time,” Apple’s chief executive, Tim Cook, said in 2019. “We’re not motivated to do that from a business point of view, and we’re certainly not from a values point of view.” Later that year, Cook told CBS: “We made the phone to make your life better, and everybody has to decide for his or herself what that means. For me, my simple rule is if I’m looking at the device more than I’m looking into someone’s eyes, I’m doing the wrong thing.”

Apple has introduced features, such as the Screen Time setting, that help people strike that balance: users can now track, and limit, their use of individual apps, or entire categories, as they see fit. Part of the problem is that, while Apple makes the phone, it doesn’t control what people do with it. Facebook needs users to open its app daily, and Apple can only do so much to counter that tendency. If these debates – about screen time, privacy and what companies are doing with our data, our attention – seem like a niche topic of interest now, they will become crucial once Apple’s latest plans become reality. The reason is the company’s worst-kept secret in years: a pair of smartglasses.

It filed a patent in 2006 for a rudimentary version, a headset that would let users see a “peripheral light element” for an “enhanced viewing experience”, able to display notifications in the corner of your vision. That was finally granted in 2013, at the time of Google’s own attempt to convince people about smartglasses. But Google Glass failed commercially, and Apple kept quiet about its intentions in the field.

Recently, the company has intensified its focus on “augmented reality”, technology that overlays a virtual world on the real one. It’s perhaps best known through the video game Pokémon Go, which launched in 2016, superimposing Nintendo’s cute characters on parks, offices and playgrounds. However, Apple insists, it has much greater potential than simply enhancing games. Navigation apps could overlay the directions on top of the real world; shopping services could show you what you would look like wearing the clothes you’re thinking of getting; architects could walk around inside the spaces they have designed before shovels even break ground.

Smartglasses could leave us quite literally seeing the world through Apple’s lens. Illustration: Steven Gregor/The Guardian

With each new iPhone launch, Apple has demonstrated new breakthroughs in the technology, such as “Lidar” support in new iPhones and iPads, a tech (think radar with lasers) that lets them accurately measure the physical space around them. Then, at the end of 2019, it all slotted into place: a Bloomberg report suggested that the company hadn’t given up on smartglasses in the wake of Google Glass’s failure, but had spent five years honing the concept. The pandemic put paid to a target of getting hardware on the shelves in 2020, but the company is still hoping to make an announcement next year for a 2022 launch, Bloomberg suggested.

Apple’s plans cover two devices, codenamed N301 and N421. The former is designed to feature “ultra-high-resolution screens that will make it almost impossible for a user to differentiate the virtual world from the real one”, according to Bloomberg’s Mark Gurman. This is a product with an appeal far beyond the hardcore gamers who have adopted existing VR headsets: you might put it on to enjoy lifelike, immersive entertainment, or to do creative work that can make the most of the technology, but would probably take it off to have lunch, for instance.

N421 is where the real ambitions lie. Expected in 2023, it’s described only as “a lightweight pair of glasses using AR”. But, argues Mark Pesce in his book Augmented Reality, this would be the culmination of the “mirrorshades” dreamed up by the cyberpunks in the 80s, using the iPhone as the brains of the device and “keeping the displays themselves light and comfortable”. Worn all day, every day, it would make the idea of a world without a digital layer between you and reality eventually fade into memory – just as living without immediate access to the internet has for so many right now.

Apple isn’t the first to try to build such a device, says Rupantar Guha of the analysts GlobalData, who has been following the trend in smartglasses from a business standpoint for years, but it could lead the wave that makes it relevant. “The public perception of smartglasses has struggled to recover from the high-profile failure of Google Glass, but big tech still sees potential in the technology.” Guha cites the recent launch of Amazon Echo Frames – sunglasses you can talk to, because they have got the Alexa digital assistant built in – and Google’s purchase of the smartglasses maker North in June 2020. “Apple and Facebook are planning to launch consumer smartglasses over the next two years, and will expect to succeed where their predecessors could not,” Guha adds.

If Apple pulls off that launch, then the cyberpunk – and cyborg – future will have arrived. It’s not hard to imagine the concerns, as cultural questions clash with technological ones: should kids take off their glasses in the classroom, just as we now require them to keep phones in their lockers? Will we need to carve out lens-free time in our evenings to enjoy old-fashioned, healthy activities such as watching TV or playing video games?

“It’s a fool’s errand to imagine every use of AR before we have the hardware in our hands,” writes the developer Adrian Hon, who was called on by Google to write games for their smartglasses a decade ago. “Yet there’s one use of AR glasses that few are talking about but will be world-changing: scraping data from everything we see.” This “worldscraping” would be a big tech dream – and a privacy activist’s nightmare. A pair of smartglasses turns people into walking CCTV cameras, and the data that a canny company could gather from that is mindboggling. Every time someone browsed a supermarket, their smartglasses would be recording real-time pricing data, stock levels and browsing habits; every time they opened up a newspaper, their glasses would know which stories they read, which adverts they looked at and which celebrity beach pictures their gaze lingered on.

“We won’t be able to opt out from wearing AR glasses in 2035 any more than we can opt out of owning smartphones today,” Hon writes. “Billions have no choice but to use them for basic tasks like education, banking, communication and accessing government services. In just a few years’ time, AR glasses will do the same, but faster and better.”

Apple would argue that, if any company is to control such a powerful technology, it ought to. The company declined to speak on the record for this story, but it has invested time and money in making the case that it can be trusted not to abuse its power. The company points to its comparatively simple business model: make things, and sell them for a lot of money. It isn’t Google or Facebook, trying to monetise personal data, or Amazon, trying to replace the high street – it’s just a company that happens to make a £1,000 phone that it can sell to 150 million people a year.

But whether we trust Apple might be beside the point, if we don’t yet know whether we can trust ourselves. It took eight years from the launch of the iPhone for screen time controls to follow. What will human interaction look like eight years after smartglasses become ubiquitous? Our cyborg present sneaked up on us as our phones became glued to our hands. Are we going to sleepwalk into our cyborg future in the same way?

Source: https://www.theguardian.com/technology/2020/nov/25/part-human-part-machine-is-apple-turning-us-all-into-cyborgs


Research

Researchers hacked a robotic vacuum cleaner to record speech and music remotely

A team of researchers demonstrated that popular robotic household vacuum cleaners can be remotely hacked to act as microphones.

The researchers—including Nirupam Roy, an assistant professor in the University of Maryland’s Department of Computer Science—collected information from the laser-based navigation system in a popular vacuum robot and applied signal processing and deep learning techniques to recover speech and identify music playing in the same room as the device.

The research demonstrates the potential for any device that uses light detection and ranging (Lidar) technology to be manipulated into collecting sound, despite not having a microphone. This work, a collaboration with assistant professor Jun Han at the National University of Singapore, was presented at the Association for Computing Machinery’s Conference on Embedded Networked Sensor Systems (SenSys 2020) on November 18, 2020.

“We welcome these devices into our homes, and we don’t think anything about it,” said Roy, who holds a joint appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS). “But we have shown that even though these devices don’t have microphones, we can repurpose the systems they use for navigation to spy on conversations and potentially reveal private information.”

The Lidar navigation systems in household vacuum bots shine a laser beam around a room and sense the reflection of the laser as it bounces off nearby objects. The robot uses the reflected signals to map the room and avoid collisions as it moves through the house.
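As a rough illustration of that mapping step, each echo can be turned into a point in the room from the beam’s angle and the round-trip time of the laser pulse. A minimal sketch with made-up numbers, not any vendor’s actual firmware:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def echo_to_point(angle_deg: float, round_trip_s: float) -> tuple:
    """Convert one Lidar echo into an (x, y) point relative to the robot."""
    distance = C * round_trip_s / 2  # the light travels out and back
    theta = math.radians(angle_deg)
    return (distance * math.cos(theta), distance * math.sin(theta))

# A wall 3 m away, straight ahead of the sensor, echoes after ~20 nanoseconds:
print(echo_to_point(90.0, 2 * 3.0 / C))  # -> (~0.0, 3.0)
```

Sweeping the beam through 360 degrees yields a point cloud, which the robot aggregates into a floor plan.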

Privacy experts have suggested that the maps made by vacuum bots, which are often stored in the cloud, pose potential privacy risks that could give advertisers access to information about such things as home size, which suggests income level, and other lifestyle-related information. Roy and his team wondered if the Lidar in these robots could also pose potential security risks as sound recording devices in users’ homes or businesses.

Sound waves cause objects to vibrate, and these vibrations cause slight variations in the light bouncing off an object. Laser microphones, used in espionage since the 1940s, are capable of converting those variations back into sound waves. But laser microphones rely on a targeted laser beam reflecting off very smooth surfaces, such as glass windows.
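The signal-processing core of that trick is small: subtract the slow-moving baseline from the reflected-intensity samples, and what remains is an audio-band waveform. A minimal sketch, assuming an array of photodiode intensity samples; real systems need far more careful filtering:

```python
import numpy as np

def intensity_to_audio(samples: np.ndarray, window: int = 256) -> np.ndarray:
    """Recover an audio-band waveform from reflected-light intensity samples."""
    # Estimate the slow baseline with a moving average, then remove it,
    # keeping only the fast, vibration-induced fluctuations.
    baseline = np.convolve(samples, np.ones(window) / window, mode="same")
    audio = samples - baseline
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio  # normalise to [-1, 1]
```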

Could your vacuum be listening to you? Deep learning algorithms were able to interpret scattered sound waves, such as those captured by a robot vacuum, to identify numbers and musical sequences. Credit: Sriram Sami

A vacuum Lidar, on the other hand, scans the environment with a laser and senses the light scattered back by objects that are irregular in shape and density. The scattered signal received by the vacuum’s sensor provides only a fraction of the information needed to recover sound waves. The researchers were unsure if a vacuum bot’s Lidar system could be manipulated to function as a microphone and if the signal could be interpreted into meaningful sound signals.

First, the researchers hacked a robot vacuum to show they could control the position of the laser beam and send the sensed data to their laptops through Wi-Fi without interfering with the device’s navigation.

Next, they conducted experiments with two sound sources. One source was a human voice reciting numbers played over computer speakers and the other was audio from a variety of television shows played through a TV sound bar. Roy and his colleagues then captured the laser signal sensed by the vacuum’s navigation system as it bounced off a variety of objects placed near the sound source. Objects included a trash can, cardboard box, takeout container and polypropylene bag—items that might normally be found on a typical floor.

The researchers passed the signals they received through deep learning algorithms that were trained to either match human voices or to identify musical sequences from television shows. Their computer system, which they call LidarPhone, identified and matched spoken numbers with 90% accuracy. It also identified television shows from a minute’s worth of recording with more than 90% accuracy.
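The researchers’ LidarPhone code isn’t reproduced in the article, but the shape of such a pipeline, time-frequency features followed by a classifier, might look like the sketch below, with nearest-template matching standing in for their trained deep network:

```python
import numpy as np

def spectrogram(signal: np.ndarray, frame: int = 512, hop: int = 128) -> np.ndarray:
    """Slice the recovered signal into windowed frames and take magnitudes."""
    frames = [signal[i:i + frame] * np.hanning(frame)
              for i in range(0, len(signal) - frame, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))  # time x frequency

def classify(spec: np.ndarray, templates: dict) -> str:
    """Return the label whose (same-shaped) template best matches spec."""
    v = spec.ravel() / np.linalg.norm(spec)
    return max(templates, key=lambda label: float(
        v @ (templates[label].ravel() / np.linalg.norm(templates[label]))))
```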

“This type of threat may be more important now than ever, when you consider that we are all ordering food over the phone and having meetings over the computer, and we are often speaking our credit card or bank information,” Roy said. “But what is even more concerning for me is that it can reveal much more personal information. This kind of information can tell you about my living style, how many hours I’m working, other things that I am doing. And what we watch on TV can reveal our political orientations. That is crucial for someone who might want to manipulate the political elections or target very specific messages to me.”

The researchers emphasize that vacuum cleaners are just one example of potential vulnerability to Lidar-based spying. Many other devices could be open to similar attacks, such as smartphone infrared sensors used for face recognition, or passive infrared sensors used for motion detection.

“I believe this is significant work that will make the manufacturers aware of these possibilities and trigger the security and privacy community to come up with solutions to prevent these kinds of attacks,” Roy said.

Source: https://techxplore.com/news/2020-11-hacked-robotic-vacuum-cleaner-speech.html


Research

iPhone 12 Pro Max receives ‘highest ever’ rating from DisplayMate, sets 11 records

DisplayMate has put the iPhone 12 Pro Max Super Retina XDR display through its highly detailed testing and the outcome isn’t surprising: Apple has once again earned “DisplayMate’s highest ever Display Performance Grade of A+” and “Best Smartphone Display Award.” However, going beyond the iPhone 11 Pro’s accolades last year, the 12 Pro Max has matched or set 11 smartphone display performance records.

DisplayMate just published its deep dive review of the iPhone 12 Pro Max display. As has become a tradition, this year’s iPhone has garnered another highest-ever A+ rating from the firm, but more notably, it has hit a milestone for how many records it has broken or matched: 11. For comparison, the iPhone 11 Pro matched or set nine display performance records last year, and the iPhone XS Max did the same for eight in 2018.

Here are the smartphone display records that DisplayMate says the iPhone 12 Pro Max has set or matched (a quick consistency check on two of the figures follows the list):

Note that numerical performance differences that are visually indistinguishable are considered matched and tied performance records.

· Highest Absolute Color Accuracy (0.9 JNCD) – visually indistinguishable from perfect.

· Highest Image Contrast Accuracy and Intensity Scale Accuracy (2.19 gamma) – visually indistinguishable from perfect.

· Smallest Shift in Color Accuracy and Intensity Scale with the Image Content APL (0.2 JNCD) – visually indistinguishable from perfect.

· Smallest Shift in Image Contrast and Intensity Scale with the Image Content APL (0.00 gamma) – visually indistinguishable from perfect.

· Smallest Change in Peak Luminance with the Image Content Average Picture Level (APL) (1 percent) – visually indistinguishable from perfect.

· Highest Full Screen Brightness for OLED Smartphones (825 nits at 100% APL).

· Highest Full Screen Contrast Rating in Ambient Light (172 at 100% APL).

· Highest Contrast Ratio (infinite).

· Lowest Screen Reflectance (4.8 percent).

· Smallest Brightness Variation with Viewing Angle (27% at 30 degrees).

· Highest Visible Screen Resolution, 2.8K (2778×1284) – 4K does not appear visually sharper on a smartphone.
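Two of those figures check out against each other if the contrast rating in ambient light is computed, as DisplayMate typically does, as screen brightness divided by screen reflectance:

```python
# Consistency check on the listed records, assuming DisplayMate's usual
# definition: contrast rating = screen brightness / screen reflectance.
peak_brightness_nits = 825     # "Highest Full Screen Brightness" record
reflectance_percent = 4.8      # "Lowest Screen Reflectance" record

contrast_rating = peak_brightness_nits / reflectance_percent
print(round(contrast_rating))  # -> 172, matching the contrast rating record
```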

DisplayMate also touches on the fact that the iPhone 12 lineup features a 60Hz display rather than an upgraded 120Hz one like the iPad Pro’s. It concludes that the standard refresh rate “should be fine for most applications.”

The iPhone 12 Pro Max display has the standard 60 Hz Refresh Rate, rather than the higher 90 Hz and 120 Hz Refresh Rates now being introduced. With the very fast Response Time of the OLED display, and the very fast CPU and GPU processors on the iPhone 12 Pro Max, the lower 60 Hz Refresh Rate should be fine for most applications.

For a detailed look at all the ways DisplayMate tested Apple’s latest state-of-the-art iPhone display, check out the full report here.

Source: https://9to5mac.com/2020/11/17/iphone-12-pro-max-receives-highest-ever-rating-from-displaymate-sets-11-records/

Copyright © 2020 Inventrium Magazine