UIKit on OS X
OS X is very old. Its UI framework, AppKit, is almost 20 years old, with roots in NeXTSTEP. AppKit has accumulated a lot of cruft, and iterating it towards modern standards takes a lot of hard work. While UIKit was built on top of Core Animation from the start, AppKit had to incorporate it as an opt-in feature that you switch on per view. Or consider NSCell-based tables vs. recyclable views, or custom drawing code vs. configurable labels in UIKit.
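To make the Core Animation difference concrete, here is a minimal sketch in today's Swift, assuming nothing beyond the two frameworks themselves; the view and its frame are arbitrary. In UIKit every view is layer-backed unconditionally, while in AppKit layer backing must be requested with wantsLayer:

```swift
#if canImport(UIKit)
import UIKit

// UIKit: every view is layer-backed; the CALayer is always present.
let view = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 44))
view.layer.cornerRadius = 8           // no opt-in required
#else
import AppKit

// AppKit: Core Animation is opt-in per view. Without wantsLayer
// the view has no layer, and layer properties silently do nothing.
let view = NSView(frame: NSRect(x: 0, y: 0, width: 100, height: 44))
view.wantsLayer = true                // explicitly switch on layer backing
view.layer?.cornerRadius = 8          // note the optional: layer may be nil
#endif
```

That optional layer in AppKit is exactly the kind of historical seam a rewrite would erase.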
iOS 7 shows what a complete rewrite may look like. If you want to update your app, you have to adapt it to the new look and feel, and to the new APIs. If you don't want to adapt, the OS ships with fully compatible old frameworks that run your app as before.
OS X could use this trick in some future release. It could add keyboard, mouse, menu and window support to UIKit, make it the default environment for the desktop, and run older apps on an AppKit that ships with the OS for compatibility. New apps would have to be compiled and released with the new tools and UIKit APIs. Older apps could still be maintained with the older tools and compiled against AppKit, but AppKit would not get any enhancements.
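The keyboard part is not pure fantasy: UIKeyCommand, added in iOS 7, already routes hardware-keyboard shortcuts through the responder chain, much like key equivalents do in AppKit. A minimal sketch; the view controller and the newDocument action are hypothetical names, not anything Apple ships:

```swift
import UIKit

// UIKeyCommand (new in iOS 7) delivers hardware-keyboard shortcuts
// to the first responder in the chain that implements the action.
class EditorViewController: UIViewController {

    override var keyCommands: [UIKeyCommand]? {
        // Declare Cmd-N as a shortcut for creating a document.
        return [UIKeyCommand(input: "n",
                             modifierFlags: .command,
                             action: #selector(newDocument))]
    }

    @objc func newDocument() {
        // Hypothetical handler: create a new document here.
    }
}
```

Menus, windows and mouse events would need far more work than this, but the responder-chain plumbing is already in place.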
All of this would help with internals. On the surface, users would only notice more advanced graphics and animations, similar to iOS. It would not change the feel of OS X much, since the system would still be driven by keyboard, trackpad and mouse. But things like buttons and scroll views would essentially be the same on both platforms, and having one toolkit for both systems would reduce the hassle of developing for them by at least 80%.
Of course, since OS X would run on UIKit, which already knows about touch, it would be interesting to think about a practical way to enable touch on conventional notebooks and desktops (if they are still around): how and why vertical screens become horizontal, and how professional interfaces full of mouse-friendly elements could be adapted for touch (or why they don't need to be). Maybe, in the interim, UIKit on OS X would not accept touches at all, yet still deliver a great deal of efficiency.
