AirPower Canceled

Apple Cancels AirPower

This is crazy, considering all the signs pointed to this being close to launch. I guess now I can finally release that AirPower money I had blocked off for a release this week.

This is a bummer. Apple obviously felt like this was important and close enough to show off, and I can’t imagine how hard the engineering team must have been working on it for the last year.

I wonder when the decision was made? It definitely seems like AirPods were ready for a launch in 2018 that was waiting on AirPower, so I can’t imagine it was any more than a month or so ago.


We’re Trying: A Family Podcast

My wife and I are launching a podcast next month. We’re also going to start trying to start a family, which is the much, much bigger deal of the two, but I think focusing on the smaller, more manageable task is helping me keep myself collected.

We’ve been talking about having kids forever, and for various reasons, none of which are terribly interesting if you aren’t in our marriage, we picked a date in March to start rolling the dice.

This is obviously a very personal thing, something many couples who plan to have kids wouldn’t share for fear of jinxing it or just oversharing. A watched pot never boils, and that sort of mentality. So why in the world would I announce it all over the internet?

For one, I didn’t want us to go through this alone. We stay in touch with our families and some old friends, but we don’t really have many people in the Boston area who we would typically share things like this with. I’m certain that as this process goes on we’ll have plenty to be excited, nervous, or anxious about. Doing this show gives us the opportunity to give regular updates to people who care about us and would want to know how it’s going.

Also, I’ve been very touched by the stories I’ve read online documenting people’s struggles with creating families of their own. Casey Liss and Daniel Farrelly in particular come to mind, but I’ve read countless stories from people who feel, at various points in the process, alone, scared, hopeless, or depressed. I don’t know what’s in the cards for us, how easy or difficult this is going to be, or what setbacks we’re going to encounter. I do know that I don’t want to feel like we’re the only ones going through our struggle, and I’m hoping being open about our experiences can either help ourselves or help others during this exciting, terrifying time.

But mostly, I’m just hoping that this will be a fun thing for Rosalee and me to do together. I think it’ll be good to have dedicated time set aside to check in with each other, make sure we’re OK, laugh, cry, do whatever we need to do to sync up. I love my wife so much, and I love finding opportunities to spend time with her.

The show is called We’re Trying, which I thought was very funny. I don’t really care if you do or not 😆. I wanted it to evoke not only the literal truth of what we’re doing, but also get across that we’re just going to be trying our best. At times along the way we will certainly fail or screw up, but at the end of the day the best you can do is give it your all, and that’s what we’re gonna do. And if we like this experience, maybe we’ll continue the show after we have the kid! Chronicle the first years and all that. I guess I just like the idea of archiving this experience for the same reason I run this blog and take so many pictures of my wife and cat: so I have a reference for a time in life I expect to love :)

So if you wanna follow along on our journey you can find the show on Apple Podcasts and subscribe now. We’ll be publishing the first episode sometime in mid-March, so keep an eye out!

We’re Trying Show Art

#family #pregnancy #children #wife #podcast

null_resettable Equivalent In Swift

Swift doesn’t have an annotation for what in Objective-C would be a null_resettable property. These are properties like UILabel.font, which will never return nil, but do allow you to set them to nil to reset to a default value.

In the past I’ve gotten around this by defining a get-only non-optional variable, and a function that acts as a setter which takes an optional. What I just learned from a coworker is that there’s a much better way to do this!

If you define your variable as an implicitly unwrapped optional, in the setter you can handle the nil case! So you can define a computed variable that resets to a default value when you set it to nil, like so:

private var storedFoo: Bar = .default

var foo: Bar! {
  get {
    // Reads never return nil; backed by a non-optional stored value
    return storedFoo
  }
  set {
    // newValue is optional here, so assigning nil resets to the default
    storedFoo = newValue ?? .default
  }
}

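To see the pattern end to end, here’s a self-contained toy version. Theme and Header are made-up stand-ins for something like UIFont and UILabel:

```swift
// Hypothetical stand-ins: Theme plays the role of UIFont,
// Header the role of UILabel.
struct Theme: Equatable {
    let name: String
    static let standard = Theme(name: "standard")
}

final class Header {
    // Non-optional backing storage, initialized to the default.
    private var storedTheme: Theme = .standard

    // Implicitly unwrapped computed property: reads never produce nil,
    // but assigning nil resets to the default, like null_resettable.
    var theme: Theme! {
        get { storedTheme }
        set { storedTheme = newValue ?? .standard }
    }
}

let header = Header()
header.theme = Theme(name: "fancy")
header.theme = nil   // resets instead of clearing
print(header.theme.name) // prints "standard"
```

Callers never see an optional on reads, but still get the ObjC-style reset-on-nil behavior on writes.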
Hope this proves helpful to some of y’all! It makes me nervous as hell adding a bang to my Swift code, but this seems like one of those few times where it’s a great idea!

#swift #learning #woah

Session Recording Apps Exposed By TechCrunch

Zack Whittaker at TechCrunch on iOS session recording tools

I was so, so happy to see this story break last night. This very common practice needs to be called out for what it is: an invasive tracking package that promises insight on users, collects data that may or may not be useful, and does the absolute bare minimum to protect user privacy. Actually, now that I think of it, I’m still waiting on a comparable exposé on the rest of the analytics industry, but I’ll take what I can get.

I’ve had friends reach out in the past to show things like this as a helpful tool their company uses, and we used to use something similar when I worked on a React Native app, although rather than recording the screen it tracked state changes and could play them back in time.

Look, I get it. You want to be able to see what users are doing, how they’re using the app, where they’re getting confused, what triggers crashes. That can be useful information. The problem comes when packages like this make it super easy to just turn on for all users, all the time. Most of those sessions are not uniquely insightful. Most of them will never be reviewed. And yet they’re tracked and stored, creating a potential attack vector for those users.

I see this same attitude at work with the analytics packages we use, in particular Firebase. At any time we can head to the website and take a peek at our users as they use the app. We can see what actions they’re taking and get all sorts of information about them that we don’t need, like interests & hobbies, other apps they use, etc. I assume this profile is built from information Google knows about the user, augmented by the data that other apps report to Firebase as well. Worst of all, we get a map showing where in the world the user is as they use our app, which doesn’t require location permission. The fact that the app I work on contributes to this awful behavior is a source of great shame.

There are people who will defend this behavior. At worst they’ll suggest that the unmasked-field issues can be fixed, and that then there will be nothing wrong with these packages. At best, there will certainly be an army of people saying that every large-scale app or service does this, and that users understand and expect this tracking as they use apps. To be honest, that was also my thinking: despite this practice being abhorrent, I thought most people dealt with it knowingly, like they deal with ads. The fact that this story is blowing up right now gives me some hope that this may be a reckoning, that consumers (or Apple) will get upset, and that these packages will start to be much more tightly controlled. Hopefully iOS will require users to opt in before it allows arbitrary screen capture, although I imagine this will be tricky to implement.

Here’s a tip to companies, like mine, that want to continue using these aggressive, immoral tracking practices.

If you want some piece of data from the user, ask them for it.

If the user gets creeped out by this request, let them say no.

If they do say no, respect that decision.

‘Users might say no’ is never a reason to collect data from or about them without asking them for it.

#privacy #analytics #tracking #ios

Fireside Cocoa ‘19

Last weekend I reached outside of my comfort zone a bit.

Curtis Herbert, indie dev and maker of Slopes, hosts a yearly retreat for Apple platform developers and friends. I’d never met Curtis before (I only knew him through Twitter), I’m generally socially anxious around new people, and I’ve only been snowboarding twice before, most recently in 2012, so there were plenty of reasons for me to ignore this event.

Despite those concerns, I was still interested when Curtis tweeted about opening up Fireside Cocoa 2019. I’ve always enjoyed opportunities to come together with members of the iOS developer community, either at WWDC-adjacent events or smaller conferences like SwiftFest, and this seemed like a chance to spend time with, get to know, and ask questions of developers much smarter and more successful than myself.

Since this event was distinctly not a conference, though, I was pretty nervous about taking this leap without the safety net of talks or sessions to provide easy conversation topics, and about my lack of snowboarding expertise. In the weeks leading up I was getting more and more anxious, wondering if I should just bow out, email Curtis saying something else came up, and offer my place to someone else who wanted to join. It would have been a very silly, cowardly thing to do, but I’ve been known to give in to that anxiety in the past when it comes to events like this. I couldn’t be happier that I resisted the urge.

The event was a blast. I made some great new friends, talked a lot about code habits and testing, and got to offer up my knowledge when I had an answer to a question. I got to spend time in Vermont, which is just lovely this time of year, play around with Slopes (thanks for the Day Pass, Curtis!), and fall down a mountain a lot. Not to mention the evening hot tub sessions, complete with the occasional snow bath to keep the blood pumping.

It was great to feel like I became a deeper part of this community I love so much, and to get the opportunity to meet, chat with, and learn from people I really respect.

And I’m glad everyone got to see me make a fool of myself playing Just Dance on Switch :)

Among other things, the weekend really made me realize that I should take part in more meetups around Boston. Most of the crew knew each other from CocoaHeads Philly, and while I’ve gone to a couple of iOS-focused meetup events I’ve never become a regular presence, preferring to maintain my own social circle of developers through Coffee & Code type gatherings. I think once I go full remote (another goal for 2019), it’d be a good way to get out of the house and maintain a presence in the Boston area. If you go to SwiftCoders or CocoaHeads Boston, reach out! I’d love to meet you and chat about these events. I’ll be heading to the next sessions of both (skipping February CocoaHeads because of V-Day, but I’ll be there in March).

So thanks again to Curtis and the whole Fireside Cocoa 2019 crew for putting together such a fun, educational, somewhat relaxing weekend. Can’t wait to do it again in 2020!


CastKit 1.0

CastKit started out because, on a whim, I booked the Apple Podcast Studio at WWDC. I didn’t know what I would talk about, who would be on the show, how it would be hosted or published, or anything about audio editing for that matter. I had a couple friends, and made some new ones, at the conference who I somehow got to agree to spend an hour in a room talking to me about the week, and by the end of the day we had a show!

I put together some rudimentary show art, just my website favicon with a different background color and some radio waves, and a manually written RSS feed, but it was enough to publish before heading out for the evening. I was able to find my new ‘show’/episode in Apple Podcasts a few days later. It was cool. It reminded me of the first time I saw my name in the App Store search results.

Over the next few months I experimented a bit with the format of this new show. I tried a solo show diving deep into an iOS 12 API, I brought on my web dev friend Austin to talk about Apple’s market cap and missed opportunities, and after the September event I brought on my local iOS Dev friends to talk about the new phones and watches. That felt like a good format, so I invited them back a few times. It was fun.

All the while I was learning a lot about podcasting and audio work. I bought a decent mic, then an arm, then an audio interface. I invested in Audio Hijack and Ferrite so I could improve the flow of my pre/post production process. I learned how much easier editing was when I didn’t try to cut every pause, ‘um’, and breath, not to mention how much more natural the result sounded.

I had (accidentally) been hitting a once-a-month pace, and that felt good! Not too much work, plenty of time to plan guests and edit a show, and there wasn’t much pressure to speed things up.

Well, it’s a new year, and I’m ready to kick things up a notch. In December I asked Bob and Steve to come on as regular co-hosts, and I’m ready to do the thing I didn’t back in June and actually ‘launch’ this show. I spent some time making new show art, I’m going back and editing out swears so people can listen around their kids and I don’t need an explicit tag next to my nice new art, I created a new Twitter account, and I’m in the middle of a mini marketing blitz. It’s a fun opportunity to get out of my comfort zone and do a little self-publicity for a bit.

So all this to say: thanks for reading my blog, and if you have been, thanks for listening to CastKit. It’s really been a labor of love, I’m becoming more and more proud of it, and I hope that it’s something that people, even just one person, can enjoy. I’ve got some fun ideas on where to take the show over the next year, and I hope you come along for the ride! Keep an eye out for new episodes on our new schedule: every 2nd & 4th Wednesday of the month.

Promo Image for CastKit with new show art

#castKit #podcast #self-promotion

Revisiting The iPad (2010)

So I’m staying in Portsmouth for the weekend with family, and our hotel is fantastic. Sonos sending music to the sitting room, five taps of coffee options, great snacks and atmosphere, but the kicker was the iPads left in every room as a guide to local restaurants and activities. Since I’ve had iPads on the brain ever since the new Pros came out last month, I decided to have some fun and write a bit about going back to the original 2010 iPad model after all this time.

This device was my first mobile college computer; paired with the iMac in my dorm room, I was able to take care of lots of school work and lots of screwing around, so I’m very much enjoying this blast from the past. These iPads can’t run anything past iOS 5.1.1, so that’s where we find this old model still sitting. I fondly remember iOS 5 for two main features.

The first was iMessage, the first break from the standard paradigm that text messages were tied to a phone number and could only be managed from that phone. It’s amazing to think how far iMessage has come since that 2011 update: moving to the Mac with Mountain Lion, letting you use your phone number as your ‘Send From’ source on any device, the advances in keeping everything synced and up to date everywhere. Meanwhile, Google has tried and tried again to bring a similar ease of communication to Android and still hasn’t settled on an approach.

I looked deep in the settings and it doesn’t appear there’s any particular device management going on here, so I figured it would be safe to sign into iMessage on the room iPad so I could send some screenshots to myself (AirDrop didn’t come to iPad until 2013 with iOS 7). When I entered my iCloud password, however, I was reminded that this OS also didn’t support two-factor authentication for iCloud, but there’s a clever workaround: I got the security code on my other device and was prompted to append it to the end of the password field before logging in. But iMessage in iOS 5 was tied not to your phone number but to your iCloud account. I remember distinctly being told by my friends to please stop texting them from my email address, because it was just weird. Very glad it didn’t take them long to fix that, although the flip side is that it’s been a pain for people to switch to Android ever since.

My other favorite thing about this update was iTunes Wi-Fi sync. A few years earlier I had found someone on Reddit looking for beta testers for an app/plugin they were working on that allowed iTunes Wi-Fi sync. The app worked well, but, not unexpectedly, Apple rejected it and it only ever found life on the Cydia store. Well, sure enough, a couple years later we got that functionality anyway. It was slow, it was limited, but it was a big step in making iOS devices more independent from iTunes, especially paired with over-the-air updates and fully local device setup without having to go to iTunes, both also features of iOS 5.

The more I think of it, it was a really good year.

Taking a step back a bit, I wanna talk about the hardware. I can’t believe this speaker design was ever used for real content; it’s hilariously small even compared to the iPad 2-4’s speaker design. And the plastic buttons alongside the slick aluminum edges of the device just feel cheap, even for back then. The non-retina screen makes looking at photos or modern web pages quite a pain. The other pain? 256MB of RAM. Every time I switch applications I see a splash screen and have to wait for the app to launch again, even when I only switched away for a few seconds. It’s amazing how we got by on something so constrained for so many things, especially when I think back to the papers I had to write on this device, jumping back and forth between Pages and iBooks or Safari to pull a quote or revisit a section of text. I guess it’s hard to go back to these constraints after being freed from them, and it gives me a little more appreciation for iOS 12’s multitasking and 4GB of RAM.

One last note about the hardware. Obviously the 90-degree sides of the new iPad Pro are a return to the form of the original iPad. One thing I hadn’t noticed, however, is that the new iPad is almost exactly as thick as the straight edges of the first iPad. It’s almost as if they took a laser and cut off the bulge of the first iPad to end up with the new industrial design, like in the iPad Air 2 ad. It’s really amazing to see how this device has grown and improved over the last 8 years while still being true to its essence.

As for the software, it was really fun (and a bit challenging) to use this device over the weekend. The animations were limited, but fun. I forgot about the ‘shuffle’ animation when switching apps via the multitasking drawer, for instance, and when I opened Game Center the top games from the App Store were ‘tossed’ onto the felt table like playing cards being dealt. Newsstand looks as beautiful as ever, because this iPad was loaded up with a couple dozen apps, unlike nearly every other device where that special folder went unopened. There was no Podcasts app, so I had to search the iTunes Store to add a podcast, and there didn’t appear to be a ‘subscribe’ option. I don’t remember how that worked back in the day, or if it was entirely managed by iTunes. And the skeuomorphic designs in Reminders and Calendar were just as silly and beautiful as ever.

I replaced my original iPad in my sophomore year with “The New iPad”, the 3rd-gen model with a Retina display. I held onto the original for a couple of years, using it as a media remote in my off-campus house and eventually passing it off to my parents, who hopefully still have it lying in a drawer somewhere. But this surprise blast from the past was quite fun to revisit and reexamine with a modern lens. Maybe next I’ll power up an iPhone 4 I have lying around and give that a look!

#iPad #ios

Peanuts By Apple

Apple producing new Peanuts content

I’m stupid excited about this. My wife adores Peanuts and we make sure to watch the holiday specials throughout the year. Hopefully these are a step above some of the later specials in quality 🤞

#tv #media #peanuts

Developing For A Second Display On iPad Pro

Since the iPad Pro came out last month I’ve been using my iPad more and more hooked up to my UltraFine 4K display. I’ve tried a handful of setups: keeping the iPad in ‘laptop’ mode in front of me, laying it flat on the table and using it like a trackpad with the Pencil, and so on. It works, but I have a few gripes. Namely, that so few apps will actually take advantage of the external display. In my testing I haven’t found a single app that does anything other than mirror. Sure, it’s not the primary display, and you can’t really interact with it except by manipulating the iPad display with touch, but it seems like a waste to have these big black bars on either side of my content when the content would look so much better if it took full advantage of the display.

I came up with a paradigm to determine what, if any, content to display on the external screen, which I’m going to be thinking on for the next couple of weeks. I’m calling it the ‘Shoulder Method.’ That is, if someone were watching over a user’s shoulder as they interact with your app, what are they really trying to see? Presumably it’s not your UI or buttons, it’s the ‘content’ being worked on. For a text editor that’s the document; for image processing it’s a preview of the image without all the chrome or tools; etc. In Safari, for instance, it doesn’t really make sense for the ‘Share’ button to be visible on the external display. It can’t be acted on from the non-touch display. The web content, however, would look great full screen on a 4K monitor without the letterboxing that happens in mirroring mode. So wouldn’t it be great if Safari rendered a second ViewController with the web content, and maybe the URL field and the other tabs visible, but without the bar button items that wouldn’t be useful on that display? Since you can’t scroll with indirect manipulation, a decent workaround, and probably the only difficult part of this coding-wise, would be to synchronize the webpage’s scroll offset, so even though the displays have different aspect ratios and would render content slightly differently, scrolling on the iPad would also scroll the external display.
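To sketch what that scroll syncing might look like (hypothetical names; a real implementation would also need to handle horizontal scrolling and avoid feedback loops):

```swift
import UIKit

// Mirrors the iPad scroll view’s relative offset onto the external
// display’s scroll view. The two views can have different sizes and
// aspect ratios, so we sync a 0...1 fraction rather than raw points.
final class ScrollSyncer: NSObject, UIScrollViewDelegate {
    weak var externalScrollView: UIScrollView?

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        guard let external = externalScrollView else { return }
        let range = scrollView.contentSize.height - scrollView.bounds.height
        guard range > 0 else { return }
        let fraction = scrollView.contentOffset.y / range
        let externalRange = max(0, external.contentSize.height - external.bounds.height)
        external.contentOffset.y = fraction * externalRange
    }
}
```

Assigning the syncer as the iPad-side scroll view’s delegate would then keep the external view tracking it.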

I decided to put this into practice with an app I use every day, my Blog Manager app. On the iPad it shows a UISplitViewController, with a table view of all my previous posts as the master and an editing view on the right with fields for title, link, tags, and body editing. Since the body is written in Markdown I have a button that toggles between Markdown preview and editing modes, but with an external display I could skip this step. Thinking through the ‘Shoulder Method’, what someone (including myself) would really want to see while watching me write a post is the rendered Markdown content. So I would want a full screen preview of that content taking up the external display, rather than a mirrored view of a table view and a bunch of fields I can’t interact with.

I’ve had this idea for a while, but finally took a couple hours to put it into practice last night, and wanted to document some challenges I found for posterity and in case it was helpful to anyone.

Firstly, I wasn’t able to just check for the presence of a second UIScreen at launch. Some DuckDuckGo-ing suggested that if you aren’t listening for screen connect/disconnect notifications (I was just trying to get a proof of concept at first) you will never actually see the second screen in UIScreen.screens. Sure enough, once I took the step to register as an observer for these notifications I was able to create an empty ViewController and display it on the second screen.

At least, so I thought. I was seeing a ‘flicker’ once I passed the breakpoint where I actually set the window’s screen and set hidden = false, but that flicker would only last a frame before the display went back to mirroring. Whoops! I was scanning for screens, creating a new UIWindow with a rootViewController, and displaying it, but then the method was ending. ARC saw that I had lost all references to that window and cleared it out, sending the second display back to mirroring mode! So I set up a dictionary to maintain references from connected screens to their associated windows, and finally the second ViewController stayed presented. Nice!
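Putting those two lessons together, the skeleton ends up looking roughly like this (a simplified sketch, not my exact code):

```swift
import UIKit

final class ScreenManager: NSObject {
    // Strong references keep ARC from tearing the windows down,
    // which would kick the display back into mirroring.
    private var windows: [UIScreen: UIWindow] = [:]

    func startObserving() {
        let center = NotificationCenter.default
        center.addObserver(self, selector: #selector(screenDidConnect(_:)),
                           name: UIScreen.didConnectNotification, object: nil)
        center.addObserver(self, selector: #selector(screenDidDisconnect(_:)),
                           name: UIScreen.didDisconnectNotification, object: nil)
    }

    @objc private func screenDidConnect(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        window.rootViewController = UIViewController() // real content goes here
        window.isHidden = false
        windows[screen] = window
    }

    @objc private func screenDidDisconnect(_ note: Notification) {
        guard let screen = note.object as? UIScreen else { return }
        windows[screen] = nil
    }
}
```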

So now I wanted to actually throw some content on there. Well, the first thing I’m legally allowed to present is a “Hello World” label, so I initialize a UILabel(), throw a string on it, and add it as a subview to the external ViewController. That’s when I discovered a quirk of an external 4K display. macOS sees this display and renders it at (roughly) @2x scale. iOS sees the screen bounds and scales the mirrored content to fit. But in code I was seeing this UIScreen with a nativeBounds at 4K resolution (roughly 4096 x 2304 pixels) and a scale of @1x. So the UILabel I was presenting with the standard system font was taking up a very, very small frame in the top corner of this giant (point-wise) screen. I tried adjusting the scale to 2.0, but it’s a read-only property, and there was no way to set a preferredScale. I also looked at all the properties of the UIScreen to figure out if there was anything to indicate that the screen supported alternate scales, but alas, nothing. The screen did have an assortment of availableModes with various screen sizes, but they were all @1x scale as well.

My first approach was to just bump up the fonts a lot. So my UILabel I was creating got set to 40.0 pt font size, and voila, legible! But I don’t want to have everything on this display be ridiculously massive, I wanted to render it at a legible scale.

What I ended up doing was, when setting up the screen and creating the window, first check to see if the screen was greater than 1080p resolution, and if so filtering the availableModes to find an option with half the size of the preferredDisplayMode, which is the default. So with my 4K screen I found an option at half size, selected that screen mode, and set up the window with that alternate frame. Sure enough, things were still kinda small but were at least legible now! I was able to set up a flexible height label and adjust my Markdown -> NSAttributedString method to bump the fonts up just a bit when the string is destined for an external display, and now my blogging app supports Markdown Preview on an external display!
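In code, that mode-picking logic looks something like this (simplified; the 1080p cutoff and matching tolerance are just what I happened to pick):

```swift
import UIKit

// Picks a roughly half-resolution mode for big @1x external screens
// so point sizes come out closer to legible.
func displayMode(for screen: UIScreen) -> UIScreenMode? {
    guard let preferred = screen.preferredMode else { return nil }
    // Screens at 1080p or below are fine as-is.
    guard preferred.size.height > 1080 else { return preferred }
    let target = CGSize(width: preferred.size.width / 2,
                        height: preferred.size.height / 2)
    // Find an available mode at roughly half the preferred size,
    // falling back to the preferred mode if there isn't one.
    return screen.availableModes.first { mode in
        abs(mode.size.width - target.width) < 1 &&
        abs(mode.size.height - target.height) < 1
    } ?? preferred
}
```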

There were a few other tweaks here and there, such as setting up margins for the external screen, which didn’t have a useful SafeArea guide, and I need to clean up some things in my ScreenManager class, but it only took a couple hours last night for me to get a pretty decent implementation of this feature working great! One task I think I need to tackle next is manually turning off the external display when appropriate. As it is, if I’m presenting my MarkdownPreviewViewController on the second screen and either let the iPad go to sleep or press the sleep button, the external display continues to show the ViewController. I can see how this would be useful for certain apps, but since there isn’t anything I can do to actually edit a post without the iPad being on, I probably want to let the second screen go to sleep as well. What I’ll probably have to do is listen for a notification that the iPad has gone to sleep or that my app has left the foreground, and set that second UIWindow to hidden. I don’t expect to have much trouble getting this working, so that of course means I’ll discover 5 different blockers when I get around to coding it.
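My current plan for that, sketched out (untested, and the notification choice is a guess until I try it):

```swift
import UIKit

// Hide the external window when the app leaves the foreground so the
// second display can sleep, and restore it when we come back.
final class ExternalDisplaySleeper {
    // Hypothetical reference to the window on the second screen.
    weak var externalWindow: UIWindow?

    func startObserving() {
        let center = NotificationCenter.default
        center.addObserver(forName: UIApplication.willResignActiveNotification,
                           object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow?.isHidden = true
        }
        center.addObserver(forName: UIApplication.didBecomeActiveNotification,
                           object: nil, queue: .main) { [weak self] _ in
            self?.externalWindow?.isHidden = false
        }
    }
}
```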

Well anyway, hopefully this has been helpful to someone other than me, but I certainly had fun getting this working and finally digging into some of these external display support APIs. If you have any questions or want to learn more about my Blog Manager app, feel free to reach out!

#ipad #UIKit #displays

CastKit V: All Of My Hands Aren’t Enough

Thursday some very good friends and I talked a lot about the new iPads, why we did or didn’t buy them, and what we hope for the future of this computing platform. As an experiment, which I’m sure I’ll be repeating, this Pod was completely edited and published from my new iPad Pro using Ferrite and Transmit. I’ll write up some more about the editing experience later, but for now suffice it to say it was so much more fun than working on my Mac in GarageBand and the ability to get a few minutes done on the train with the Apple Pencil was incredible.

Sorry for some weird audio moments, I’m still learning audio stuff and we had a bad QuickTime export force us to use backup audio, which had a few minutes of odd echo.

Overcast Link

Apple Podcasts


#ipad #drawing #iOS #podcast
