
User interface

When Steve Jobs launched the iPhone 4 in June 2010, one of the big new features was a much higher resolution screen. The iPhone 3GS’s 3.5″ screen displayed 320 by 480 pixels. The new phone displayed 640 by 960 pixels in the same space. The number of pixels displayed per inch increased from 163ppi to 326ppi.

Developers didn’t have to change the layouts of their applications to run on the new phone. Instead of displaying 320 by 480 apps at half the size on the 640 by 960 screen, the new version of iOS used twice the pixels horizontally and vertically to show the same content. Since then, developers have designed their iPhone apps for 320-point-wide screens, even though the vast majority of users see them on 640-pixel-wide screens.

Apple marketed this new screen as a ‘Retina display’. Apple later said that the definition of a Retina display is a screen where individual pixels cannot be distinguished at a normal viewing distance. In the case of the new phone, that means a screen resolution of at least 300ppi when used at a distance of 10-12 inches. The combination of figures is summarised as ‘Pixels per Degree’ (PPD) – the number of pixels per degree of visual angle as seen from a specific distance. 300ppi at 10″ equates to a PPD of around 53. The iPhone’s 326ppi at 10″ has a PPD of 57.
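PPD falls out of simple trigonometry: work out how many inches one degree of visual angle covers at the viewing distance, then multiply by the pixel density. A quick sketch in Python (the function name is mine, not any standard):

    import math

    def pixels_per_degree(ppi, distance_inches):
        # One degree of visual angle spans 2 * d * tan(0.5 degrees)
        # inches at a viewing distance of d inches.
        inches_per_degree = 2 * distance_inches * math.tan(math.radians(0.5))
        return ppi * inches_per_degree

    print(pixels_per_degree(300, 10))  # ~52.4 – the 'around 53' above
    print(pixels_per_degree(326, 10))  # ~56.9 – rounding to 57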

Minutes after the launch, Apple fans started speculating about when other products would get a Retina display update. The top of the range iPod Touch followed in September 2010. The iPad got a Retina display in March 2012, followed soon after by Retina MacBook Pros in June 2012.

With every new launch event, many hope for a Retina display in the specifications of new Apple products. But what does Retina mean for iMacs and desktop displays?


In August 2010 (almost a year before the introduction of Final Cut Pro X) Apple applied for a user interface patent that is relevant to colour correcting video clips. They were awarded patent 8,468,465 today.

Although Apple has chosen a different UI for colour correction in Final Cut Pro, the UI shown in this new patent may turn up in future Apple applications.

Abstract

Some embodiments provide a computer program that provides a graphical user interface (GUI) for controlling an application. The GUI includes a contiguous two-dimensional sliding region for defining several values. The GUI also includes several sliders for moving within the sliding region. Each slider selects one or more values from the several values based on a position of the slider within the sliding region. The selected values are parameters for controlling one or more operations of the application.

Excerpt

Figure 16 from the patent, showing multiple sliders within a two-dimensional sliding region.
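As a rough illustration of the idea in the abstract, each slider’s position within the shared two-dimensional sliding region maps to a pair of parameter values – for colour correction, perhaps one parameter per axis. A minimal sketch, with all the names invented for illustration (none of this comes from the patent text):

    class Slider:
        """A slider at a normalised (0.0-1.0) position in the region."""
        def __init__(self, x, y):
            self.x = x
            self.y = y

    def slider_values(slider, x_range, y_range):
        # Map the slider's 2D position to two parameter values, e.g.
        # hue shift on one axis and saturation on the other.
        (x_lo, x_hi), (y_lo, y_hi) = x_range, y_range
        return (x_lo + slider.x * (x_hi - x_lo),
                y_lo + slider.y * (y_hi - y_lo))

    shadows = Slider(0.25, 0.8)
    hue, saturation = slider_values(shadows, (-180.0, 180.0), (0.0, 2.0))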


Last week, Bret Victor posted A Brief Rant on the Future of Interaction Design, a must-read essay for those who think that the future of technology interaction will be primarily spent stroking flat panes of glass:

[T]ake out your favorite Magical And Revolutionary Technology Device. Use it for a bit.

What did you feel? Did it feel glassy? Did it have no connection whatsoever with the task you were performing?

I call this technology Pictures Under Glass. Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade.

[…]

With an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?

Perhaps he is right. Minority Report’s computer interactions have been a distracting influence on many OS user interface designers.


Part of the art of writing patents is to protect concepts that might be used in future products without delineating them too clearly.

Case in point: Apple was awarded a patent yesterday: Gestures for controlling, manipulating, and editing of media files using touch sensitive devices. Here’s the abstract:

Embodiments of the invention are directed to a system, method, and software for implementing gestures with touch sensitive devices (such as a touch sensitive display) for managing and editing media files on a computing device or system. Specifically, gestural inputs of a human hand over a touch/proximity sensitive device can be used to control, edit, and manipulate files, such as media files including without limitation graphical files, photo files and video files.

Seems mainly about Apple getting a patent for gestures used to edit video on multi-touch devices. But I think the interesting phrase there is proximity sensitive device. That means we’ll be able to edit without touching a screen (or wearing special gloves).

Hidden in the middle of the patent are the following two sentences:

Finally, using a multi-touch display that is capable of proximity detection … gestures of a finger can also be used to invoke hovering action that can be the equivalent of hovering a mouse icon over an image object.

Ironically, one of the arguments against making Flash available on multi-touch devices is that the majority of Flash-implemented UI elements use the position of the mouse pointer – without the mouse button being clicked – as feedback to the user, a concept not possible using multi-touch. If devices included advanced proximity detection technology, then ‘mouseover’-equivalent events could be sent to Flash UIs – so they’d work the way they have since Shockwave and .fgd files.
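If a screen could report fingers hovering above it, turning that into ‘mouseover’-style behaviour is mostly bookkeeping: track which UI elements sit under a hovering fingertip from frame to frame, and fire enter/leave events when the set changes. A minimal sketch of that dispatch logic – the event names echo the DOM’s mouseenter/mouseleave, and the proximity data itself is imagined:

    def dispatch_hover_events(previous, current, send_event):
        # previous/current: sets of ids of UI elements that were/are
        # under a hovering (near but not touching) fingertip.
        for element in current - previous:
            send_event(element, "mouseenter")
        for element in previous - current:
            send_event(element, "mouseleave")

    # Called once per frame with the elements under each hovering finger.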

Although granted yesterday, the patent was applied for in June 2007. In August 2007, I wrote about gestural edits that required the UI to be able to detect fingertip position while not touching the screen.

I also wrote about Apple being granted a patent for using a camera mounted to a portable device to detect hand movement in three dimensions.

Given that Apple say that Final Cut Pro has been rewritten from the ground up, it is very likely that it stores its information in a database that will be available to other applications and users. It is likely that multiple users will have access to the database at the same time.

That means new collaboration opportunities.

Sound

Given that the new interface is much clearer at helping users establish and change sync between clips of all kinds, it makes it easier for sound editors to work on the same timelines as picture editors. They’ll be able to do a great deal of work on audio sweetening (including fixing sync on clips) while the picture editors continue to work. For audio specifically, this would work better if the position of one audio clip – a voiceover, for instance – could define where other clips dipped their levels.

Editing

Collaboration works best when each user can easily understand which parts of a project they can look at and modify.
A suggested user interface showing that a compound clip is unavailable for editing.

Perhaps collaboration between editors will be afforded by ‘checking out’ compound clips on a master timeline. ‘Checking out’ is a database term meaning that an individual record is locked while a specific person makes changes, but it can still be looked at, and other parts of the database can still be changed. In the case of Final Cut Pro X, the primary editor would be able to see that a compound clip is being worked on by an assistant while being temporarily unable to edit it.
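In database terms, a check-out is just a lock that records who holds it while leaving the record readable. A minimal sketch of how that might look for compound clips – purely illustrative, nothing here is Apple’s API:

    class CompoundClip:
        def __init__(self, name):
            self.name = name
            self.checked_out_by = None  # None: anyone may edit

        def check_out(self, editor):
            if self.checked_out_by is not None:
                raise PermissionError(
                    f"{self.name} is checked out by {self.checked_out_by}")
            self.checked_out_by = editor

        def check_in(self, editor):
            if self.checked_out_by == editor:
                self.checked_out_by = None

    verse = CompoundClip("Verse 2")
    verse.check_out("assistant")
    # The clip can still be read and played; a second check_out()
    # raises until the assistant checks it back in.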

As well as being a repository for a signed-off B-roll sequence, compound clips could also contain the verse in a music video, a scene in a movie or an episode in a web series. While I work on the verse/scene/episode, another editor could be repositioning the compound clip in the movie or even splitting it in two.

The Audition feature allows a given compound clip to display different alternatives, chosen using a Cover Flow-style display. Perhaps instead of choosing between individual shots, in future editors might choose between different edits of a compound clip. Senior editors could use this feature to quickly compare old and new edits by an assistant.

External suppliers

Maybe we’ll be able to give external suppliers access to chosen parts of our FCP X projects:

    Transcribers and translators (Given access to specifically-tagged ranges within clips, to which they’ll be able to directly add subtitling/close-captioning/alternate soundtracks)
    Picture graders
    Visual effects houses

We might even be able to choose which elements of our project are backed up on iCloud.

Other applications

Fun fact: if the Mac or iOS device is the dongle for Final Cut, then Apple loses nothing by allowing developers to create applications that have peer access to FCP X project databases – Core Data databases can only be served by OS X and iOS devices. That means that if there is a market for applications that can edit FCP X data using an old-style, 20th-century (Avid/Premiere/FCP 7) UI, then Apple will allow third parties to develop them. The more applications that can manipulate FCP X project databases, the more high-margin iMacs Apple will sell.

Legacy tape & digital formats

There is a commonly believed rumour that Final Cut Pro X won’t have any DV or HDV tape capture features.

If Apple want metadata-based workflows to become more popular (preparing the way for more solutions to replace Unity setups at TV stations), it is likely that they’ll want metadata to work well with all media.

That means that if you import an FCP 6-7 project into X and spend the time adding metadata tags to people, places, shots, takes and regions, it is unlikely that Apple would throw that work away if someone decided to do a batch recapture in an older version of FCP and reimport the QuickTime files.

If an FCP X project is a database, then even if tape capture isn’t built into X, there’ll likely be another tool able to add to or change the database for those with tape-based content.

Batch recapturing doesn’t apply to tapes alone…

Given the multi-format editing nature of FCP X, the digitising tool might also be able to ‘re-link’ tapeless source data to projects that were converted from FCP 6 and 7. It makes sense that if you used an H.264 or RED to ProRes 422 LT workflow in FCP 7, you could open the project in X and continue work after linking the timeline to the source RED or H.264 data.
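Relinking a proxy timeline to camera originals usually means matching clips on metadata that survives transcoding, such as reel name and starting timecode. A minimal sketch of that matching step – the field names are my guesses, not FCP’s:

    def relink(timeline_clips, camera_originals):
        # Index the camera originals by metadata that survives a
        # proxy workflow: reel name and start timecode.
        index = {(m["reel"], m["start_tc"]): m["path"]
                 for m in camera_originals}
        for clip in timeline_clips:
            key = (clip["reel"], clip["start_tc"])
            if key in index:
                clip["media_path"] = index[key]  # point at the original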

Please note that this post shouldn’t be counted as a rumour, just speculation: now that Final Cut Pro has been rewritten, there could be new opportunities to improve its collaboration features.

The latest Mac rumour is that Apple will announce a multitouch trackpad for desktop Macs.

For this new device to be useful, Apple needs to define how multitouch works when you don’t look at what you’re touching but still need to be accurate. At the moment, you can use MacBook trackpad gestures for a variety of commands (previous page, next page, scaling), but these gestures don’t require accurate fingertip positioning.

In order for us not to look at the ‘magical’ touchpad we’re using with our Macs, we need to know where our touches would land in the current application’s user interface before we make them. That means we can keep looking at the monitors we already have, but still get the benefits of multitouch manipulation.

In August of 2007, Apple patented a multitouch interface for a portable computer that uses a camera to detect where a person’s hands are before they touch the trackpad or keyboard.

Illustration from Apple's 2007 Multitouch patent featuring a camera detecting where a user's hands are when not touching the trackpad.

Now that we have a device for detecting where our fingertips are, Apple need to update the UI guidelines to allow for multiple cursors (one for each fingertip) and let fingers not touching the trackpad still send a ‘hover’ message to user interface items.

For example, they could use filled and unfilled circles to show where fingertips are: unfilled where a fingertip is hovering over the trackpad, filled to show contact:

A screenshot from Final Cut showing a fingertip touching one edit and another hovering over a different edit.

In this Final Cut example, one fingertip is touching an edit, another is hovering over a different edit. To select more than one edit, editors hold down the option key and click additional edits. In a multitouch UI, the editor could hold down a fingertip on the first edit and tap the other edits to extend the selection:
Final Cut screenshot showing four edits selected

The hovering fingertip circles could also show the context of what would happen if the user touched. Here’s an example from Avid:
Mockup of multitouch UI extensions to an Avid screenshot.

Here the editor has their left hand over the multitouch trackpad. The index finger is touching, so its red circle is filled. As we are in trim mode, the current cursor for the index finger is the B-side roller, because it is touching a roller. The other fingers are almost touching. They are shown with unfilled circles, with faint cursors that are correct for where they are on the screen: the middle and ring fingers have arrow cursors; if the little (pinky) finger touched, it would be trimming the A-side roller.
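The filled/unfilled convention boils down to one decision per fingertip: is it in contact, and what is under it on screen? A minimal sketch – the finger and ui objects are hypothetical, not any real OS X API:

    def cursor_for(finger, ui):
        # Choose a context-sensitive cursor for the point on screen
        # this fingertip maps to, filled only when it makes contact.
        target = ui.element_at(finger.screen_position)
        if target is not None and target.kind == "trim_roller":
            cursor = "roller"  # A- or B-side trim cursor
        else:
            cursor = "arrow"
        return {"position": finger.screen_position,
                "cursor": cursor,
                "filled": finger.touching}  # filled circle = contact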

Once you can directly manipulate the UI using multiple touch points, you’ll be able to get rid of even short-lived modes. I wrote about gestural edits back in 2007.

Ironically, once Apple does provide multitouch UI extensions to OS X, then the concepts of hovering, ‘mouseenter’ and ‘mouseleave’ can be added to Flash-based UIs for those using multitouch devices. Oh well!

In which I provide a link that compares HD video web page embedding options.

The design of this blog means I can’t show how large HD video embeds on web pages you control can be. This (non-designed*) page on my site compares size and quality between YouTube and Vimeo HD video.

The maximum embed size for YouTube is 853 by 480 and the service is free. If you pay $60 for the Vimeo ‘Plus’ service, you get 25,000 HD embedded plays at 1280 by 720.


A still from the HD video embedded on my site using both YouTube and Vimeo.

*I still avoid any HTML features more modern than tables
