iCalvin.org

Developing For A Second Display On iPad Pro

Since the iPad Pro came out last month I’ve been using my iPad more and more hooked up to my UltraFine 4K display. I’ve tried a handful of setups: keeping the iPad in ‘laptop’ mode in front of me, laying it flat on the table and using it like a trackpad with the Pencil, and so on. It works, but I have a few gripes. Namely, that so few apps actually take advantage of the external display. In my testing I haven’t found a single app that doesn’t just display in mirrored mode. Sure, it’s not the primary display, and you can’t really interact with it without touching the iPad’s screen, but it seems like a waste to have big black bars on either side of my content when that content would look so much better if it took full advantage of the display.

I came up with a paradigm for deciding what, if any, content to show on the external screen, one I’m going to be thinking on for the next couple of weeks, which I’m calling the ‘Shoulder Method.’ That is, if someone were looking over a user’s shoulder while they used your app, what would they really be trying to see? Presumably it’s not your UI or buttons; it’s the ‘content’ being worked on. For a text editor that’s the document, for image processing it’s a preview of the image without all the chrome or tools, etc. In Safari, for instance, it doesn’t really make sense for the ‘Share’ button to be visible on the external display. It can’t be acted on from the non-touch display. The web content, however, would look great full screen on a 4K monitor without the letterboxing that happens in mirroring mode. So wouldn’t it be great if Safari rendered a second ViewController with the web content, and maybe the URL field and the other tabs, but without the bar button items that wouldn’t be useful on that display? Since you can’t scroll with indirect manipulation, a decent workaround, and probably the only difficult part of this code-wise, would be to synchronize the webpage’s scroll offset, so even though the displays have different aspect ratios and would render the content slightly differently, scrolling on the iPad would also scroll the external display.
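Here’s a rough sketch of what that scroll syncing might look like, assuming both screens render the page in a UIScrollView; the class and the externalScrollView property are hypothetical names, not anything Safari actually exposes.

```swift
import UIKit

// Hypothetical names throughout: just an illustration of syncing scroll
// position between two views of the same content.
final class BrowserViewController: UIViewController, UIScrollViewDelegate {
    let scrollView = UIScrollView()          // the scroll view on the iPad itself
    var externalScrollView: UIScrollView?    // the one rendering on the second screen

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.delegate = self
    }

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        guard let external = externalScrollView else { return }

        // Sync by relative progress rather than absolute offset, since the two
        // screens have different sizes and aspect ratios.
        let scrollable = scrollView.contentSize.height - scrollView.bounds.height
        guard scrollable > 0 else { return }
        let progress = scrollView.contentOffset.y / scrollable

        let externalScrollable = max(0, external.contentSize.height - external.bounds.height)
        external.contentOffset = CGPoint(x: 0, y: progress * externalScrollable)
    }
}
```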

I decided to put this into practice with an app I use every day, my Blog Manager app. On the iPad it shows a UISplitViewController with a table view of all my previous posts as the master, and an editing view on the right with fields for the title, link, tags, and body. Since the body is written in Markdown I have a button that toggles between Markdown preview and editing modes, but with an external display I could skip that step. Thinking through the ‘Shoulder Method’, what someone (including myself) would really want to see if they were watching me write a post is the rendered Markdown content. So I’d want a full-screen preview of that content taking up the external display, rather than a mirrored view showing a table view and a bunch of fields I can’t interact with.

I’ve had this idea for a while, but finally took a couple of hours to put it into practice last night, and wanted to document some of the challenges I ran into, for posterity and in case it’s helpful to anyone.

Firstly, I wasn’t able to just check for the presence of a second UIScreen at launch. Some DuckDuckGo-ing suggested that if you aren’t listening for the screen connect/disconnect notifications (I was just trying to get a proof of concept going at first), you’ll never actually see the second screen in UIScreen.screens. Sure enough, once I registered as an observer for those notifications I was able to create an empty ViewController and display it on the second screen.
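The observer setup looks roughly like this; it’s a simplified sketch rather than my exact ScreenManager, and the method names are just illustrative.

```swift
import UIKit

final class ScreenManager {
    static let shared = ScreenManager()

    func startObserving() {
        let center = NotificationCenter.default
        center.addObserver(forName: UIScreen.didConnectNotification,
                           object: nil, queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            self?.configureWindow(for: screen)
        }
        center.addObserver(forName: UIScreen.didDisconnectNotification,
                           object: nil, queue: .main) { _ in
            // Tear down whatever window was driving the departing screen.
        }

        // A display may already be attached at launch; in my testing it only
        // showed up in UIScreen.screens once I was observing these notifications.
        UIScreen.screens.dropFirst().forEach { configureWindow(for: $0) }
    }

    private func configureWindow(for screen: UIScreen) {
        // Create a UIWindow for the screen and show it; more on this below.
    }
}
```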

Or so I thought. I was seeing a ‘flicker’ once I passed the breakpoint where I actually assigned the window to the screen and set hidden = false, but the flicker would only last a frame before the display went back to mirroring. Whoops! I was scanning for screens, creating a new UIWindow with a rootViewController, and displaying it, but then the method was ending. ARC saw that I had lost all references to that window and cleared it out, sending the second display back to mirroring mode! So I set up a dictionary to keep a reference to each connected screen and its associated window, and finally the second ViewController stayed presented. Nice!
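Filling in that hypothetical configureWindow(for:) from the sketch above, the fix is just to hold on to the window somewhere, for example a dictionary keyed by screen:

```swift
// Inside the same hypothetical ScreenManager: keep a strong reference to each
// window, keyed by its screen, so ARC doesn't release it when this method returns.
private var windows: [UIScreen: UIWindow] = [:]

private func configureWindow(for screen: UIScreen) {
    let window = UIWindow(frame: screen.bounds)
    window.screen = screen
    window.rootViewController = MarkdownPreviewViewController()
    window.isHidden = false

    // Without this line the window's only reference dies with the method, and
    // the external display snaps back to mirroring after a single frame.
    windows[screen] = window
}
```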

So now I wanted to actually throw some content on there. Well, the first thing I’m legally allowed to present is a “Hello World” label, so I initialized a UILabel(), threw a string on it, and added it as a subview to the external ViewController. That’s when I discovered a quirk of an external 4K display. macOS sees this display and renders it at (roughly) @2x scale. iOS sees the screen bounds and scales the mirrored content to fit. But in code I was seeing this UIScreen with a nativeBounds at 4K resolution (roughly 4096 x 2304 pixels) and a scale of @1x. So the UILabel I was presenting with the standard system font was taking up a very, very small frame in the top corner of this giant (point-wise) screen. I tried adjusting the scale to 2.0, but it’s a read-only property, and there was no way to set a preferredScale. I also looked through all the UIScreen properties to figure out if there was anything indicating that the screen supported alternate scales, but alas, nothing. The screen did have an assortment of availableModes with various screen sizes, but they were all still @1x.
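If you want to see this for yourself, the relevant UIScreen properties are easy to dump; the values in the comments are roughly what my 4K display reported, not something to take as gospel.

```swift
// Poking at what the external UIScreen reports.
if let external = UIScreen.screens.dropFirst().first {
    print(external.bounds)        // in points; roughly 4096 x 2304 on my display
    print(external.nativeBounds)  // in pixels; the same numbers, because...
    print(external.scale)         // ...the scale is 1.0, and it's read-only
    external.availableModes.forEach { print($0.size) }  // various sizes, all effectively @1x
}
```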

My first approach was to just bump up the fonts a lot. The UILabel I was creating got set to a 40.0 pt font size, and voila, legible! But I didn’t want everything on this display to be ridiculously massive; I wanted to render things at a sensible scale.

What I ended up doing, when setting up the screen and creating the window, was to first check whether the screen was larger than 1080p, and if so filter the availableModes for an option at half the size of the preferredMode (which is the default). So with my 4K screen I found a mode at half size, selected it, and set up the window with that alternate frame. Sure enough, things were still kind of small, but at least they were legible now! I was able to set up a flexible-height label and adjust my Markdown -> NSAttributedString method to bump the fonts up just a bit when the string is destined for an external display, and now my blogging app supports Markdown preview on an external display!
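The mode filtering ended up looking something like this (a simplified sketch of the approach; the half-size heuristic is tuned for my 4K display, and the properties are standard UIScreen API):

```swift
// Pick a display mode at half the preferred (native) resolution for screens
// larger than 1080p, so content renders at a saner point size.
private func halfResolutionMode(for screen: UIScreen) -> UIScreenMode? {
    guard let preferred = screen.preferredMode,
          preferred.size.height > 1080 else { return nil }

    let target = CGSize(width: preferred.size.width / 2,
                        height: preferred.size.height / 2)
    return screen.availableModes.first {
        abs($0.size.width - target.width) < 1 && abs($0.size.height - target.height) < 1
    }
}

// Then, in configureWindow(for:), before creating the window:
// if let mode = halfResolutionMode(for: screen) { screen.currentMode = mode }
// let window = UIWindow(frame: screen.bounds)  // bounds now reflect the new mode
```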

There were a few other tweaks here and there, such as setting up margins for the external screen, which doesn’t have a useful safe area layout guide, and I still need to clean up some things in my ScreenManager class, but it only took a couple of hours last night to get a pretty decent implementation of this feature working. One task I think I need to tackle next is manually turning off the external display when appropriate. As it is, if I’m presenting my MarkdownPreviewViewController on the second screen and either let the iPad go to sleep or press the sleep button, the external display continues to show the ViewController. I can see how that would be useful for certain apps, but since there isn’t anything I can do to actually edit a post while the iPad is asleep, I probably want to let the second screen go to sleep as well. What I’ll probably have to do is listen for a notification that the iPad has gone to sleep, or that my app has left the foreground, and set that second UIWindow to hidden. I don’t expect to have much more trouble getting this working, which of course means I’ll discover 5 different blockers when I get around to coding it.
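When I do get to it, I suspect the sleep handling will be something along these lines, hanging off the same hypothetical ScreenManager sketch; totally untested, since this is the part I haven’t built yet.

```swift
// Hide the external windows whenever the app leaves the foreground, and bring
// them back when we become active again.
func observeAppState() {
    let center = NotificationCenter.default
    center.addObserver(forName: UIApplication.willResignActiveNotification,
                       object: nil, queue: .main) { [weak self] _ in
        self?.windows.values.forEach { $0.isHidden = true }
    }
    center.addObserver(forName: UIApplication.didBecomeActiveNotification,
                       object: nil, queue: .main) { [weak self] _ in
        self?.windows.values.forEach { $0.isHidden = false }
    }
}
```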

Well anyway, hopefully this has been helpful to someone other than me; I certainly had fun getting this working and finally digging into some of these external display support APIs. If you have any questions or want to learn more about my Blog Manager app, feel free to reach out!

#ipad #UIKit #displays