Many people are waiting for Apple to fully commit to ‘fixing’ TV. Following on from disrupting the music industry with the iPod and iTunes Music Store and the mobile phone industry with the iPhone and iTunes App Store, when will Apple take on broadcast TV? Also, is their answer TV hardware or software?

“It will have the simplest user interface you could imagine. I finally cracked it.”

One of the most repeated excerpts from Walter Isaacson's official biography, ‘Steve Jobs’, still prompts questions. On the eve of every Apple product announcement event, we wonder whether this time we will find out what Steve meant.

Apple TV hardware

Over six years ago Apple announced their TV ‘hobby’ product: the Apple TV. A small device connected to HD TVs, designed to store, stream and play back TV shows and other 720p digital content via a network-connected Mac or PC. They made a point of not promoting it as a major platform at the level of the Mac or iPod. They described it as a product that would help Apple explore future media possibilities. Apple didn’t want analysts to presume that Apple TV would be a second market-disrupting product in the same way that the iPod and iTunes Store were.

Over the years since March 2007, Apple have slowly evolved their hobby. In January 2008 a software update removed the need for a Mac or PC to purchase via the iTunes Store. Steve Jobs:

Apple TV was designed to be an accessory for iTunes and your computer. It was not what people wanted. We learned what people wanted was movies, movies, movies.

September 2010 saw the biggest change in the Apple TV: the ‘2nd generation’ version dropped the internal hard drive. It was also much smaller and much less expensive. The current 3rd generation Apple TV has a faster processor and more streaming services at full 1080p resolution.

Why does Apple TV remain simply a (very profitable) hobby for the iPhone, iPad and Mac maker? The complex TV and film market in the USA and worldwide.



Given that 3D is dying, the next great hope for film and TV seems to be UHD TV, Ultra High Definition TV.

Canal+ Spain dubbed today ‘4K Day’

Here is their 4K promotional video that I think was broadcast by satellite today and uploaded to YouTube.

If you have software that can download YouTube videos, you can get this footage if you want to practice your 4K post workflow.

For example if you use Safari and if you have the ClickToPlugin Safari Extension, you should be able to select 4K MP4 from the invisible top-left pop-up menu and then download the 1.5GB file to your computer.

Here is an example of how much detail there is in a 4K frame that was encoded using the UHD-1 flavour of H.264 – 3840 x 2160 at 25fps. The 1.5GB MP4 file had an average data rate of 22 Mb/s.
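As a quick sanity check of those numbers: the file size and average data rate are from the clip itself, but the duration and per-frame budget below are derived, not stated anywhere official.

```python
# Rough sanity check of the 4K clip numbers quoted above.
# Known: 1.5 GB file, ~22 Mb/s average data rate, 25fps.
# The duration is derived here, not taken from the clip's metadata.

file_size_bits = 1.5 * 1000**3 * 8   # 1.5 GB in bits (decimal GB)
data_rate_bps = 22 * 1000**2         # 22 Mb/s in bits per second

duration_s = file_size_bits / data_rate_bps
print(f"Approximate duration: {duration_s / 60:.1f} minutes")  # ≈ 9.1 minutes

# Average compressed size of one 3840 x 2160 frame at 25fps:
bits_per_frame = data_rate_bps / 25
print(f"Average frame size: {bits_per_frame / 8 / 1024:.0f} KiB")
```

So roughly nine minutes of footage, with each 8.3-megapixel frame squeezed into a little over 100 KiB on average – which is why 4K encoding quality matters so much.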

Click it to see the pixels at 1:1.


4K is big news for production designers and makeup artists!


Click to see at 3840 by 2160.

Apple’s new streaming service is called iTunes Radio. Not iMusic. What elements could be included that would justify such a name?

The current music streaming services offer very large music libraries to those who pay a subscription fee or accept listening to adverts. Although streaming services have features named ‘radio’, they don’t sound much like broadcast radio.

I think there is room for a streaming service that adds elements of radio: the shared experience, regular elements, a reliable schedule.

FutureRadio = Purchased music + Streamed music + Curated music + Shared experiences

Imagine a service that combines purchased music, music that fits well with purchased music and shared audio experiences.

Curated music in this case starts with algorithmically chosen music that works well with the music you want to listen to. iTunes has a ‘Genius’ command that creates playlists of tracks from your library that fit the genre or mood of a chosen track. Spotify has a ‘Radio’ feature that creates playlists based on an artist, genre or time period.
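The algorithmic side of this can be sketched very simply. The scoring below is invented for illustration – real services like Genius or Spotify Radio draw on listening data at enormous scale – but the shape of the idea is the same: rank a library against a seed track.

```python
# Illustrative sketch of 'seed track' playlist generation, loosely in the
# spirit of iTunes Genius or Spotify Radio. The library, genre and mood
# tags, and the scoring are all made up for this example.

library = [
    {"title": "Track A", "genre": "electronic", "mood": "upbeat"},
    {"title": "Track B", "genre": "electronic", "mood": "mellow"},
    {"title": "Track C", "genre": "folk",       "mood": "mellow"},
    {"title": "Track D", "genre": "electronic", "mood": "upbeat"},
]

def genius_style_playlist(seed, library, length=3):
    """Rank library tracks by naive similarity to the seed track."""
    def score(track):
        # One point for a matching genre, one for a matching mood.
        return (track["genre"] == seed["genre"]) + (track["mood"] == seed["mood"])
    candidates = [t for t in library if t["title"] != seed["title"]]
    return sorted(candidates, key=score, reverse=True)[:length]

seed = library[0]
for track in genius_style_playlist(seed, library):
    print(track["title"])  # Track D first: it matches both genre and mood
```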

Curation doesn’t only have to come from software algorithms: it could come from the playlist of a favourite radio station or radio show. Radio station playlists change periodically, some every week, some every few months. Radio show playlists could be based on the music played over the previous years, year, month or week. The playlist could be the exact tracks played on a recent show.

There is also a place for non-music content on a radio service.


iAds is the system where iOS developers fund their work by letting Apple insert adverts into their applications for the iPhone, iPod Touch and iPad.

Google make their money by displaying adverts that are possibly relevant based on what you’ve searched for and the history of how you’ve used Google’s services (search, mail, maps…).

So how will Apple choose what ads to insert into iOS applications? Here’s a quote from their iAds sales site:

And using our unique audience interest and preference data, your ad is delivered to the consumers most likely to respond and buy.

Standard targeting options on the iAd Network include:

Application preferences
Music passions
Movie genre interests
Television genre interests

Looks like Apple is using information gleaned from how you use iTunes on your Mac or PC. It may be that those who downloaded iFart and Jiggle(?) get served different iAds from those who use ‘Shakespeare’ regularly.

Apple could go further. They could also analyse how you use your iOS device. They could profile you based on which books you buy using iBooks and which videos you watch on YouTube and Vimeo. Could they tell your level of education by analysing your spelling and grammar? They could also judge you by your friends (using the social networking features of iTunes, or even from your Contacts list)…

In the long run, Apple might even be able to detect your state from how you use your iPhone: “Judging by app use, they seem to be working at the moment. Best not to serve frivolous content” or “The Twitter list they are using is a list of tweets by entertainers, no need to show work-related iAds.”

This sort of analysis could be much more accurate than what Facebook offers its advertisers. However much time people spend on Facebook, Apple is able to gather more information about its users by controlling the devices used to access the internet and all forms of media.

In an ideal world I’d see no commercials, but if I want others to fund TV, radio, podcasts and iOS apps, I’ll put up with some ads. If so, they might as well be relevant to me. If I’ve just bought a car, it’ll be a waste of my time and their money for the media to serve me any car ads for a while. However, in return for this kind of convenience, I need to give up some privacy.

I wonder if the default user option for this kind of profiling will be opt-in or opt-out…

PS: iAds might not be only for iOS apps

An imaginary ‘media payment preferences’ control.

Fun fact: Apple got a patent for inserting adverts into media at playback back in June 2008. I wrote about the implications back then.

In which I suggest that timestamps of live comments during TV shows could be used to replay them when you catch up with a live event

A few days ago I started to be wary of reading blog posts and tweets. I’ve been following the modern version of Battlestar Galactica. I may not have seen every episode, but I’ve enjoyed the most recent series. My social media reticence has been down to the fact that the concluding episode was shown in the US a few days ago. That episode will be on TV tonight. I’m looking forward to seeing it, but I’m off to the London Bloggers Meetup, so I’ll be playing it back later.

In the past I’ve noticed that big fans of some TV shows like to message others while the show is on air. They post messages to forums, tweet, add comments to blog posts in real time – as the episode unfolds.

I’m not that much of a fan to have my eye on a computer screen while watching a great story, but I sometimes like to check out what people wrote once the show is over.

That prompted me to come up with an idea – it might be interesting to be able to follow people’s comments (audio as well as text) if they were optionally integrated into the stream – synced so that they were available at the time they were originally posted.

It would be like having informational subtitles or commentary tracks to a film or TV show – but they could be created by anyone anywhere. If I was selling content, I would provide an option to subscribe to alternate streams (of any content).
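The core mechanism here is simple: convert the absolute time each live comment was posted into an offset from the start of the broadcast, so the comments can be replayed against any later viewing. A minimal sketch, with timestamps and comments invented for illustration:

```python
# Sketch of the idea above: take comments posted during a live broadcast
# and replay them at the right moment during later catch-up viewing.
# The broadcast time and comments are made up for this example.

from datetime import datetime

broadcast_start = datetime(2009, 3, 20, 21, 0)   # when the episode aired

live_comments = [
    (datetime(2009, 3, 20, 21, 1, 30), "Great opening shot!"),
    (datetime(2009, 3, 20, 21, 15, 0), "Did not see that coming."),
]

def to_offsets(comments, start):
    """Convert absolute post times to seconds-from-start of the show."""
    return [((ts - start).total_seconds(), text) for ts, text in comments]

# A player could overlay each comment when playback reaches its offset.
for offset, text in to_offsets(live_comments, broadcast_start):
    print(f"{offset:5.0f}s  {text}")
```

A real service would also need to account for ad breaks and regional schedule differences, but the offset-from-start idea is the heart of it.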

This follows on from my post on using a simple tag to extend HTML to allow overlays on everything.

Create a live chatroom with a single click at TinyChat.

In which I suggest that making it easy for everyone to show movies and TV shows to various-sized audiences would revolutionise media.

In these days of democratic production and distribution through digital technology, it’s about time we had a look at the exhibition side of things.

In the UK there has been some support for indie film distribution through the Film Council’s Digital Screen Network. They’ve fitted out over 230 screens around the country with digital projectors. This means microbudget films could even be released on DVD to many cinemas around the UK.

How about adding a few thousand more screens to the programme?

I suggest it would be a good idea for the UK government to combine two aspects of movie exhibition to make it simpler for anyone to create a cinema:

1. Some sort of open-source digital rights management scheme, so that content owners wouldn’t be worried about making their work available for exhibition. This would include automatic payment for rights holders by exhibitors.

2. A one-stop licensing scheme so that amateurs can arrange to pay rights-holders, public liability insurance, get permission from local authorities (and whoever else needs to get involved) for a single price.

Maybe by 2011, movies will premiere all over the country on all sorts of screens.

Any person with a room and a projector could simply create a permanent or one-off cinema for whatever content they wanted. Licence prices could be banded so that the economics were straightforward, based on audience size.

Films could be made available at different screen resolutions. SD for up to 40 people, 2K for larger audiences. Most TV shows would be cheaper to show, unless you want to show the HD version with surround sound. People would then be able to promote screenings, knowing how many people they need to get to watch. You could set up a season of obscure films or have a weekend party based around watching 23 episodes of your favourite TV show (leading up to a final 24th episode).
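The banding described above is easy to model. The specific bands, resolution tiers and prices below are invented for illustration – the point is that an exhibitor could look up exactly what a given audience size costs before promoting a screening.

```python
# Sketch of banded licence pricing by audience size, as described above.
# The bands, resolution tiers and prices are invented for this example.

BANDS = [
    # (max audience, resolution tier, licence price)
    (40,   "SD", 20.00),
    (200,  "HD", 75.00),
    (2000, "2K", 250.00),
]

def licence_for(audience_size):
    """Return the (resolution, price) band covering an audience size."""
    for max_size, resolution, price in BANDS:
        if audience_size <= max_size:
            return resolution, price
    raise ValueError("Audience too large for available bands")

print(licence_for(35))    # small living-room screening: SD band
print(licence_for(150))   # village-hall screening: HD band
```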

Indie and short film makers might get their films shown as part of creative double bills. Once this form of distribution becomes common, producers will be able to calculate how many licences at which price points they’ll need to sell to justify producing an idea in the first place.

The easier it is for movies to find audiences, the better it is for the film industry.

Here’s a link to a previous post on charging for content based on screen size, which implies the size of the audience.

In which I use the social media element of a UK advertising campaign to demonstrate how clients and agencies will need to learn to trust unsupervised copywriters with their brands.

If you’re interested in the future of advertising, maybe you should follow Aleksandr Orlov on Twitter or Facebook.

To anyone who’s ever been in interminable edits with directors, copywriters, art directors, agency people and people from the client, you might not believe the following: with social media, you’ll have to find writers that you can trust.

Despite the small number of words and images used in any given TV ad campaign, the script still gets endlessly taken apart and criticised by everyone involved. Given that eventually most campaigns will need a social media element, imagine how many more words and images will need to be created and distributed… and approved by someone.

This sort of micro-managing won’t work for social media.

A month ago, a campaign started to promote a car insurance price comparison site called ‘Compare the Market’ – it features a meerkat character who wants the audience to understand the difference between that site and the site he runs: ‘Compare the Meerkat’:

An ad I don’t mind seeing when it comes up on TV, but one that didn’t make me feel the need to find out more.

On Monday, writer, actor and TV presenter Stephen Fry got his 100,000th follower on Twitter. On Tuesday, he was up to 110,000. That made him the third most followed (after Barack Obama and CNN) on Twitter. On Tuesday, he also got stuck in a lift for a while with a few other people. While waiting for the engineers, he took a picture and uploaded it to Twitpic:
People trapped in a stuck lift

Twitpic is a site used by Twitter users to share pictures in updates. The update that linked to that picture looked like this in Twitter: ‘Here we are x’

Over 65,000 people have seen this picture. A few hours later ‘aleksandr_orlov’ posted this on twitpic:
Picture of Orlov the Russian meerkat stuck in a lift

Over 3,000 people went to Twitpic to see this image. A day later I wanted to catch up with what Stephen Fry had been doing since – I thought that his getting stuck in a lift would be one of those random events that gets Twitter that bit closer to the mainstream.

Stephen Fry is considered a national treasure in the UK, and he is becoming one of the first people British people follow when they sign up with Twitter. That means that some of Stephen’s updates don’t make as much sense as others – they are replies to messages from followers, and you need to see the message he’s responding to.

In this case Stephen had posted this update: ‘@alboreto I thought there was someone else in there…..’ – on Tweetree I saw that @alboreto was retweeting the post from @aleksandr_orlov – I didn’t recognise the name or the new character pasted into the lift picture.

I then checked out @aleksandr_orlov’s Twitter profile. Here are some of his recent updates:
@Aleksandr_Orlov's updates

Aleksandr_Orlov is a puppet character from the Compare The Market insurance comparison website UK TV campaign. He has almost 2,000 Twitter followers and almost 200 updates in the last month. Interesting how most of the posts are responses to messages from other Twitter users. All his updates are consistent with the way he is portrayed in the advert. His bio explains his Russian accent and links to his Compare The Meerkat website.

If you want to see social media copywriting in action, visit his profile on Twitter. You’ll see how each answer is tailored to each question from other Twitter users:
Aleksandr_Orlov's conversations

Aleksandr Orlov has almost 80,000 friends on Facebook. Some Facebook notes are directly part of the campaign:

All my friends,
I have made new TV advertisement!
It seem some people still visit my site looking for car insurance deal. So this time I have make absolute clear difference. Only mongoose could not understand.
Please enjoy sneaky preview at

This note got some comments playing along:
Facebook responses to Aleksandr's note
Other Facebook notes maintain the character:

On Friday, I am travel to Miami, USA to see artist Celine Dion perform in ‘Taking Charge World Tour’! This will be 6th time I have seen Ms Dion live perform greatest movie song of all time ‘my heart will go on’. Magical. I will put up story of my trip when I return. I am excite!
Please do not be to much jealous

His channel on YouTube is quieter, but he still makes friends with YouTube users and has extra information about his character:

Movies and Shows: Baywatch, Antiques Road show, Meerkat Manor, Top Gear, Titanic
Music: Tchaikovsky, The Beach Boys, Shania Twain
Books: War and Peace, The Meerkat Mongoose Wars: A History

This sort of quick response to members of the public from the personification of a brand requires that the client trust the ad agency, and the ad agency trust the copywriter(s). If someone makes a mistake they can delete a tweet, but they cannot edit it – and deleted tweets look suspicious too. The writer needs to maintain a consistent character 24 hours a day, using the principles of stand-up comedy and improvisation to respond to other users of Twitter, YouTube, MySpace and so on.

The only caveat is that although a campaign may be entertaining for audiences and useful for the CVs of the people involved, the proof of the pudding is in the eating. This will only be a positive case study for the future of advertising if it improves the fortunes of Compare the Market – we’ll see.

15 years ago I wrote an essay called “What if Media was Media?” It was based around an idea that might interest others, but I wasn’t sure what to do about it. As I wasn’t on the internet back then, all I could do was print it out and give it to a few people who might be able to help me…

The core point was that people may come to understand copyright more deeply because computer file formats will have layers of rights information built in. In 1994, people hardly ever referred to the contents of computer files as ‘media’. I was imagining a system where all movies, TV, radio and music were created, distributed and delivered in digital forms.

I saw that the flexibility of digital media would make it much easier for old-fashioned media to be copied. To facilitate ubiquitous distribution, I thought it would be interesting if the file format itself included information on the rights-holders.

Imagine buying a video camera: before you first use it, you enter unique contact information (possibly pointing to a .tel registry entry). The camera would then encode your ID into all the footage you shoot. You might even choose a default copyright statement too: ‘©2009 Alex Gollner – For rights see fee table at’

Once the rights information is included with the footage, then every time the footage is played elsewhere, the playback software will determine whether the person watching wants to pay a one-off fee or a licence to watch as many times as they want. Of course they could get an advertiser to pay on their behalf:

An imaginary ‘media payment preferences’ control.
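In practice the rights information would be embedded in the media container itself; a plain dictionary stands in for it in this sketch. The registry ID, fee amounts and field names are all invented for illustration.

```python
# Sketch of per-file rights metadata, as imagined above. A real system
# would embed this in the media container; a dict stands in for it here.

footage_rights = {
    "copyright": "©2009 Alex Gollner",
    "rights_holder_id": "example-registry-entry",  # hypothetical registry ID
    "fees": {
        "one_off": 0.50,      # watch once
        "unlimited": 2.00,    # watch as many times as you like
    },
}

def fee_due(rights, viewing_choice, advertiser_funded=False):
    """Work out who owes what to the rights holder for this viewing."""
    fee = rights["fees"][viewing_choice]
    payer = "advertiser" if advertiser_funded else "viewer"
    return payer, fee

print(fee_due(footage_rights, "one_off"))
print(fee_due(footage_rights, "unlimited", advertiser_funded=True))
```

The same lookup could be extended with audience-size multipliers for the larger-screening case mentioned below.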

They would also choose whether they want to watch on their own, or play it to larger audiences:

The system could also take into account times when footage is incorporated into other productions. If you witnessed the feel-good story of the week – when a talented and brave airline pilot saved passengers and crew by landing his stricken plane on the Hudson – and shot footage that news organisations all over the world wanted to show, they could upload it from your camera. If media rights were encoded into the file, each time the news item is shown on TV, from an archive, streamed on a corporate website or even embedded elsewhere, you would get a cut of the fees paid.

It’s a dilemma. On one hand ‘the little guy’ would automatically get paid. On the other, everyone who has a camera pointed at them will want to know what’s in it for them…

Matt Davis suggested…

An open source subtitle plugin that allows in-sync tweet-style text on ANY non-text media.

Of course I can’t just link to this idea, I’m supposed to add value…

Commentary on the quality of the books available in a local library in the 60s by Orton and Halliwell
Back in the sixties, writers Joe Orton and Kenneth Halliwell first became known for the prank of defacing books from their local public library.

In the 1970s audiences started partici… pating during midnight screenings of The Rocky Horror Picture Show.

I first heard about Hypertext back in 1986 from Peter Brown. He pointed out that every time academics quote text from somewhere else, a link should appear that will take you to the document from which the quote comes.

The silhouette of the MST3K commentary team
Not long after that, Mystery Science Theater 3000 started in the US. It was a show featuring silhouettes of people making ad-libbed funny comments in front of a series of terrible B-movies. This was followed by more shows featuring ‘unauthorised’ commentary on content, such as The Chart Show (to a small extent) in the UK, and Beavis and Butt-head and Pop-Up Video in the US.

Videodiscs and latterly DVDs popularised commentary tracks and alternative subtitles. These days you can download fan-made commentaries and alternate subtitle tracks (used by those pirating movies into other languages).

Due to the academic uses hypertext was initially put to, I thought it was mainly used to comment on other people’s work to make attribution clearer. That use has fallen by the wayside. Maybe it’s time to revive the idea.

Wouldn’t it be interesting if people could upload commentary designed to be overlaid on top of other content – including video and audio? Instead of linking to a page, video or podcast, the content would appear as a new background for the current page. You would then use a layer on top to comment on or add to the content below. If a video or podcast played, the player would pass timecode information to the layer above so that comments could be displayed at specific times.
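The timecode hand-off described above amounts to a simple lookup: the player reports its current position, and the overlay layer returns whichever comments should be visible at that moment. The comment data below is invented for illustration.

```python
# Sketch of the overlay layer's side of the timecode hand-off. Each
# comment has a display window in seconds; the data is made up here.

overlay_comments = [
    {"start": 12.0, "end": 18.0, "text": "This shot references the pilot."},
    {"start": 15.0, "end": 22.0, "text": "Listen to the score here."},
    {"start": 40.0, "end": 45.0, "text": "Continuity error: the mug moves."},
]

def comments_at(timecode, comments):
    """Return the comments whose display window covers the given timecode."""
    return [c["text"] for c in comments if c["start"] <= timecode < c["end"]]

print(comments_at(16.0, overlay_comments))  # two overlapping comments here
print(comments_at(41.0, overlay_comments))
```

In a browser, the player would call something like this from its time-update event, with the overlay layer handling positioning and styling.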

This example shows a pop-up comment overlaid on top of a video on a YouTube page:
Showing how a page could use another page as a background

You could choose how your overlaid comment would look, and how you’d show which page element is being commented on:
A picture showing the darkened background

As well as text commentaries, you could also add picture, audio and video overlays to any content on any page.

This is just the beginning – a way of creating mashups using HTML 5.X…

Either Microsoft is terrible at creating videos, or they have a good sense of an ironically bad video. Check out this submission. I think they know what they’re doing:

Play Microsoft Songsmith demo video

Their newly announced product available from Microsoft Research automatically generates accompanying music to any words you sing into your computer. You can choose key and musical style. You can then go back and change the chord progression if needed. $40 gets you a downloaded application that might be fun.

Songsmith gives me another idea.

smFrontczak: Imagine an application that you tell a story to: it adds sound effects, ambience and even music to your speech, turning your story into a higher production-value podcast or radio play. This would happen using voice recognition to understand the story, in conjunction with a large sound effects library. smFrontczak could also enhance radio plays: actors could speak selected stage directions, which would then be edited out of the final version.

George: The cathedral's got a mosaic...
Connor: Hurry, it's almost noon!
The children leg it across the bustling
market square and burst into
the murky cathedral.
Mark: The sun! The sun!
The cathedral clock begins to strike noon
(continues over the following)
George: The beam's pointing right at...
...the Blue Knight's shield!

If actors (or a talented individual using different voices) read this script out, smFrontczak could interpret it by fading the busy market square out to the left, fading the cathedral in from the left, and changing the ambience applied to the voices to make them sound as if they are in an echoing hall. Then the church bell strikes could commence and continue (at reduced volume) during the scene.
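The hard part of such a system is the speech recognition; the cueing logic on top of it could be as simple as keyword rules. Everything below – the keywords, the action names, the rule table – is invented purely to illustrate the shape of the idea.

```python
# Toy sketch of how the imagined 'smFrontczak' might map recognised
# stage directions to audio actions. The keywords and actions are made
# up; a real system would need speech recognition and a large SFX library.

CUE_RULES = [
    # (keyword phrase, audio action to trigger)
    ("market square", "play ambience: busy market"),
    ("cathedral",     "play ambience: large echoing hall"),
    ("clock begins to strike", "start bell strikes, duck under dialogue"),
]

def cues_for(script_line):
    """Return the audio actions triggered by keywords in one line of script."""
    line = script_line.lower()
    return [action for keyword, action in CUE_RULES if keyword in line]

print(cues_for("The children leg it across the bustling market square"))
print(cues_for("The cathedral clock begins to strike noon"))  # triggers two cues
```

Crossfades, panning and volume ducking would then be driven by the order in which cues fire as the reading progresses.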

When films are in preproduction, teams are brought in for previsualisation. Storyboarding and animation software (sometimes in 3D) is used to plan scenes to guide many departments. Perhaps smFrontczak could be used to support the sale of a script in the first place – a tool to turn actors’ readings into dynamic radio plays…

This is the next step on the way to the day when someone will invent a real Holophonor.

The Holophonor is an imaginary device from Futurama, the animated series set 991 years in the future from some of the people who make The Simpsons. It is a musical instrument that uses holographic technology to create 3D operas to accompany the music.

I hope it’ll be a few decades until a real Holophonor appears. In a way, the technology and media industries are paving the way for the day when an individual will be able to compose and perform a complete sensory experience and share it with an audience.

What will audiences need imagination for then…?
