

Apple’s new Music and Video apps leak



Apple has long been rumoured to be splitting iTunes into separate apps, mirroring the way the firm handles those apps on iOS.

Now we have visual confirmation, via an image obtained by 9to5Mac, that Apple will in fact be doing this, and the odds are on the firm announcing the move at WWDC next week.

Apple apparently won’t be using its Marzipan technology for these apps, so they will be native Mac apps with all that entails. As per 9to5Mac, they’ll be based on iTunes code and feature a similar design language, but with differently coloured sidebars.

Both apps will sync content to your iPhone, iPod and iPad, and will function very similarly, albeit with library differences. The Music app will integrate local, streamed and purchased music in one library view, while the Video app will separate purchased content from local content (not unlike how Microsoft and Google handled their respective offerings).

The main reason for the split is that iTunes has steadily grown bloated, becoming a haven for features that, on the face of it, were myriad and unrelated to one another. With this change, perhaps Apple’s engineers can focus on making each individual app the best version of what it’s supposed to be.




Apple’s Home app takes a step back in iOS 13




Apple is upgrading HomeKit this fall with new features like Secure Video and expanded automation, but it’s not all good news for Apple’s smart home framework. The Home app, where users manage the smart home experience, makes one design choice that is likely meant to make it more approachable. In practice, the change degrades the experience for a whole category of HomeKit products.

Difficulty taking in the Home app at a glance once more than a few accessories are added has been a common complaint about the app’s design from the start. In iOS 13, Apple appears to be addressing this feedback with a change to how individual products with multiple accessories are presented.

In iOS 12, each accessory is presented as its own tile even if it’s part of a single product. This can result in a single HomeKit product populating the Home app with a half dozen tiles.

Apple’s Home app treats products with multiple accessories differently in iOS 13. One product is one tile, even if it includes two or more accessories. Fewer tiles should mean a more glanceable app, right?

Not so fast. Here’s where the change becomes a step back. The awesome Eve Degree sensor is a single HomeKit product that measures both temperature and humidity. Summer can be brutal on the Mississippi Gulf Coast where I live, so I really like knowing exactly how hot and how humid it is at my front door.

That information is glanceable in iOS 12 where each measurement is presented as its own sensor accessory:

This is how iOS 13 presents the same accessory:

You can still expand the tile to show more information (in this case, all of the information), but the tile itself includes no updatable information.

Here’s another example using the indoor Eve Room product. This is iOS 12:

And this is the new approach starting with iOS 13:

Again, all of the useful information has been collapsed into a single tile with no sensor data. Glanceable data is gone, requiring an interaction to actually see what those sensors are reporting:

While the change is disappointing for sensors, it’s downright confusing for other types of HomeKit products that include more than one accessory.

For example, the useful VOCOlinc Flowerbud diffuser and mood lamp is a single product with two very different HomeKit accessories. iOS 12 lets you easily view the status and control each accessory with individual tiles:

Here’s that same product squeezed into a single tile in iOS 13:

I realize the examples are becoming exhausting, but that only illustrates how disappointing the experience is in iOS 13’s Home app. The change even affects HomeKit power strips like the VOCOlinc PM2E, which can include plugs, lights, and fans as assigned by the user in the Home app.

iOS 12:

And iOS 13:

The worst example is saved for last: HomeKit camera sensors. The Arlo Baby Cam is primarily used as a video camera with sound input and output. A firmware update to the camera later added support for five additional accessories using the product’s built-in sensors.

Five more sensors is an awesome improvement to an already great HomeKit product — at least in iOS 12:

If you want to read the same sensor data in iOS 13, simply tap on the camera tile, view the live video feed, find the button for other accessories in that room, and then, bam, you’ve managed to do something that took zero steps in iOS 12.

Glanceable information isn’t the only benefit lost. Discoverability is a casualty too.

This design choice isn’t just a glitchy bug found in a beta version of Apple’s software. It’s an intentional design change that required development work for this update. And it hasn’t improved as the rest of the operating system has started to stabilize; it’s only grown worse.

Why did Apple make this change? The idea is logical. One product should have one tile that can be expanded to show every included accessory. It reduces the tile count and clears up which accessories belong to which products in your home.

The answer being tested in each iOS 13 beta so far doesn’t appear to be the correct solution, however. Glanceable information is removed, and very different types of accessories are squeezed together.

Apple’s Home app already has a solution to the too-many-tiles problem. Users have control over which tiles appear on the Favorites section of the Home app. You can show as many or as few as you choose.

Accessories are only unpacked when you view a specific room, and even then you only see which accessories are located in that room. iOS 13 even improves this by moving bridges that offer no data or control to a different part of the Home app.

The Home app also has a mechanism for grouping accessories — if you choose — which is useful for turning multiple bulbs into a single lamp. iOS 13 could learn from that approach with a toggle to unpack accessories grouped under a single tile. The default can be bad as long as there’s a logical way out.




Microsoft Will Pay Hackers $30,000 For Finding Flaws In The New Edge Browser




Today, Microsoft released the first beta of the all-new Edge browser. The company is now inviting anyone who wants to get a sneak peek to download the app… and inviting hackers to find ways to compromise it. Microsoft will even pay cash for their exploits.

The new Edge browser, canary channel (Lee Mathews/Forbes)

That’s not as strange as it might seem. It’s actually quite common for a company like Microsoft to offer security researchers rewards for reporting software vulnerabilities. There’s good money to be made hunting “bug bounties” these days.

With Edge inching closer and closer to its first stable release, Microsoft isn’t messing around. Edge needs to be as secure as it can possibly be when it’s released to the general public, and so the call went out for help finding and patching weaknesses.

Researchers can bank up to $30,000 by discovering “high impact vulnerabilities” in either the beta or the developer release of the Edge browser. They can target any OS that Edge runs on: Windows 10, Windows 8.1, Windows 7, and even macOS.

The $30,000 top prize will be reserved for bugs that are true showstoppers. An exploit may, for example, have to allow the attacker to break out of Edge’s sandbox — a virtual container that isolates code running in Edge from the rest of the operating system.

A properly implemented sandbox makes software much harder to hack, which is why Google added one to Chrome. Chrome has been a difficult target for hacking teams at events like Pwn2Own, and that has a lot to do with its sandbox.

But a sandbox doesn’t make an app bulletproof. Chrome has fallen on multiple occasions because, given enough time and opportunity, hackers will find a way to defeat just about anything. That includes the new Edge.

With the big re-launch coming, Microsoft wants to batten down the hatches. Offering the most secure browser around could be the key to making big gains early on, especially in enterprise environments, where IT administrators might deploy an app to hundreds, thousands, or tens of thousands of users to keep networks as secure as possible.

Edge is based on Chromium, the same open source code that powers Google Chrome, and one of the ways Google keeps Chrome secure is by offering bug bounties to researchers who discover vulnerabilities. Now that Microsoft is offering up its own bug bounty program, the new Edge browser will reap the benefit of outside help on two fronts.

There are plenty of reasons you might want to try out the new Edge. It’s a very slick browser and it’s built on the same base as Google Chrome, so there’s a minimal learning curve when you switch. It may also wind up being the most secure browser around now that Microsoft is willing to make it rain, and that’s a pretty good reason to switch browsers in 2019.




Facebook Allows Instagram to Create Face Filters




The rate at which Instagram adds new features is rather fascinating; it means there’s never a dull moment on the social media platform.

The new addition comes via Facebook’s tool for building augmented reality (AR) effects, Spark AR, which now allows anyone to make custom face filters and other effects for Instagram Stories.

This was previously limited to approved creators, but the new update will let anyone create and upload their own AR filters to Instagram.

Instagram has also introduced an ‘Effect Gallery’ tab on artists’ profiles, which lets them display their creations.

Instagram AR filters launched in May 2018, but they only became popular after more creators joined the closed beta last October. The surreal designer filters have helped boost AR creators’ followings, as effects can only be added to a user’s in-app camera once they follow the creator’s account.

Once a user follows a creator, effects from that account will automatically show up in the Stories camera’s effects tray. Bearing in mind that this update will likely unleash a number of face filters into the Stories ecosystem, Instagram is adding a “Browse Effects” option at the end of the effects tray so users can discover and try new AR filters.

Facebook first announced that Spark AR would be moving out of its closed beta on Instagram at its F8 conference earlier this year. The company says more than one billion people have used AR effects created on the platform, across Facebook, Instagram, Messenger, and Portal.

Now that it’s open to everyone, we will most likely see even more AR effects taking over Instagram Stories.

If you don’t see the new feature in your Instagram Stories yet, try updating the app.



