Apple introduced some major changes to the iPad with its iOS 11 beta earlier this week. While you can use the iPad just as you always have, there are additional multitasking changes that really make the iPad Pro more of a laptop contender. Apple has created a dock that acts more like something you’d find on macOS, and refined its side-by-side apps interface so it’s even more similar to Windows 8. These changes make the iPad a lot more useful, but also more confusing than it has ever been.
Let’s start with the dock
The new dock extends the number of apps you can pin to the bottom of the iOS home screen, but it also acts as the main way to control which apps can float or be used side by side. It replaces the old app picker from iOS 10: instead of swiping from the right to access apps, you now swipe up from the bottom of the display (even if you’re in an app) to bring up the dock. Coming from iOS 10, the first time I tried to use the dock I was confused about how to get apps side by side. I kept swiping from the right or swiping on the apps in the dock, but nothing happened. It’s not immediately obvious how to even activate multitasking, and I had to watch our own hands-on video to figure it out. Not a great start.
Placing apps into Split View is simple once you know how: you tap and hold an app icon on the dock and drag it out onto the screen. It might feel like you’re about to rearrange the dock, since that’s how moving icons works elsewhere in iOS, but the app you’ve grabbed instead turns into a floating window that can snap to the side of an existing one. This works if you activate the dock by swiping up, but if you’re on the home screen, holding down an app will simply move it around with no option to create a split view. You can also activate a “Slide Over” view by dragging an app from the dock onto an existing one, which creates a hovering window you can position on the left or right, on top of either a single app or a pair of Split View apps. That means you can have three apps open in multitasking view at once. And it’s where things start to get truly confusing.
If you have a hovering app open on top of other apps, the interactions with that app are messy, even for a beta. There’s a little handle at the top that hints you can swipe down on it, but if you miss the tap target, the Notification Center (which now looks like a lock screen) activates instead. Likewise, closing the app from this view is extremely frustrating. You’d expect a swipe down to dismiss it, but that gesture actually snaps the app into a side-by-side view or replaces whatever other app you had snapped. To dismiss a floating app, you have to grab its very left edge and push it off the right-hand side of the screen. To get it back, you then swipe in from the right-hand edge of the screen. If that sounds confusing, it’s because it simply is. Swiping from the right-hand edge normally does nothing when no apps are floating, so Apple has repurposed an edge-swipe gesture for one super specific task.
Split View also includes some oddities when you try to add a third, floating app. You have to drag the app onto the bar that controls the split size to trigger the floating behavior; if you don’t, it will simply replace whichever app is in view. These three-app multitasking controls are now very similar to Windows 8’s, and they suffer from the same problems: the gestures aren’t obvious, and managing the different snapped app states isn’t easy. Apple has taken many of Microsoft’s better ideas from Windows 8, but implemented them in an equally confusing way.
You might be sitting there and thinking “well, who cares if you can use the iPad as you’ve always done, just ignore the multitasking” or “it’s a beta, it will get better.” Those are both good points, but the iPad Pro wants to be more than an iPad. Apple is trying to convince iPad users to upgrade to more capable iPad Pro devices, and it really wants existing laptop users to switch. If you’re buying an iPad Pro then you want the powerful hardware and the powerful software capabilities to match. Apple wants the iPad Pro to be considered as a real computer, but a mixture of its hardware and software is still holding it back from being a true laptop replacement.
Sure, you can ignore these gestures if you want, just like thousands of Windows and Mac users ignore keyboard shortcuts they don’t know about. The problem Apple faces here is that it’s trying to convince everyone that a touch interface is just as productive as a keyboard and mouse. In many instances that’s true, and as apps adapt to buy into Apple’s world view, that will only improve. With iOS 11, though, these gestures are designed to replace the need for a mouse, and they’re getting too complicated. iOS 11 is still in beta so things might improve, but the fact that Apple is struggling to make this stuff easy to use points to a fundamental problem with touchscreens and precision when it comes to productivity.
Touchscreen vs. keyboard and mouse
Microsoft attempted to force its Windows 8 interface onto traditional PCs in a vague hope that it would get more tablet apps and boost its mobile efforts. Windows 8 users hated this, because they were used to using a keyboard and mouse for tasks and precision. Equally, Apple is forcing people to use a touchscreen for productivity and it’s confusing its message with optional keyboard and stylus additions to the iPad Pro. This keyboard doesn’t have a trackpad for precision, and you’re forced to move your hands from the keys to reach out and touch most of the time you want to interact. Yes, there are keyboard shortcuts that help, but a lack of mouse input feels unnatural if you’re used to a laptop.
Apple has caved on keyboard and stylus support for the iPad, so it might seem obvious that the company will eventually implement some type of mouse support. I’m not convinced it will, as Apple’s iOS hardware is primarily designed around touch. Apple sees touch as the future, and the iPad is slowly heralding that future. Drag and drop in iOS 11 is an excellent example of that, and a window into the future of the iPad. Software developers have been eagerly awaiting such a feature, and perhaps now they’ll start to invest in more complex and productivity-focused apps for the iPad. There are some, like the Aviary photo editor, that are truly great examples of our touch future, but there aren’t enough.
iOS 11 marks a bigger divergence between the familiar iPhone and iPad interfaces, and it could help better define the iPad in the future. The big question of “who is the iPad for?” is never-ending, but iOS 11 signals that Apple is willing to make the fundamental interactions on an iPhone and an iPad very different. That could help set a new direction for the iPad. While Apple left it up to software developers to create millions of apps and define the iPhone, the iPad has struggled to generate the same interest from developers, thanks to slowing sales and its awkward position between iPhone and Mac.
The complex gestures we see in iOS 11 will only get trickier as Apple continues to build them out instead of supporting a mouse. If Apple does want the iPad Pro to be considered a laptop contender, it will need to refine both its keyboard hardware and its software gestures and features. Apple still needs to prove that the touchscreen can truly replace a keyboard and mouse for professional productivity, so get used to remembering lots of gestures and swiping around on an iPad display, instead of simply pointing and clicking, for the foreseeable future.