Kaiser Chiefs: Album Generator
Specialmoves Labs
Working with our friends at W+K London, we launched a site for the Kaiser Chiefs. Well, it's a bit more than a site: it lets you create your own version of the band's new album, 'The Future is Medieval'. Choose ten songs from a list of twenty, design your own artwork, and get a £1 kickback for each copy sold.
Technical Objectives
- To build a process that generates a bespoke, user-defined album with custom ID3 tags embedded in each track.
- To make that process robust and intelligent enough to scale up or down with site traffic while consistently delivering fast album generation.
What were the unknowns?
- How to change the meta-data in the ID3 tag of an mp3
- How to archive the mp3s once they had been tagged
- How long it would take to process one album request
- How many albums would be required per day
- How to deliver the content to the customer
What was the process?
The solution had to be Windows based, as the hosting company's servers ran a Windows OS. Writing a Windows service in .NET seemed the best way forward, especially as the main site was written in MVC 3.0 and we wanted to maximise code reuse.
We kicked off the project with a proof of concept in a console app. Once we were happy with it, we created the final Windows service that would generate the album. Windows services let you create long-running executables that run in their own Windows sessions; they can start automatically when the machine boots, can be paused and restarted, and show no user interface.
The complicated parts of this application were the ID3 tagger and the archive compressor. We did some research and quickly found some great open source frameworks. For the tagger, we first came across http://id3lib.sourceforge.net/ but found it overkill for what was needed (we'll look at it for another project though; it looks great). We opted for taglib-sharp http://taglib-sharp.sourcearchive.com, a much smaller library that served our needs. As you can see from the example below, it's really easy to update an mp3 tag:
using System.Collections.Generic;
using TagLib;

var filename = @"c:\test\song1.mp3";
var mp3 = TagLib.File.Create(filename);

// Load the cover art and wrap it in an ID3v2 attached-picture frame
var pic = Picture.CreateFromPath(@"c:\test\cover.jpg");
var albumCoverPictFrame = new TagLib.Id3v2.AttachedPictureFrame(pic)
{
    MimeType = System.Net.Mime.MediaTypeNames.Image.Jpeg
};

var genre = new List<string> { "Alternative" };
var composers = new List<string> { "Hodgson, Wilson, White, Rix and Baines" };
var artists = new List<string> { "Kaiser Chiefs" };

// Album and albumSong come from our domain model for the user's selection
mp3.Tag.Album = Album.Title;
mp3.Tag.Title = albumSong.Song.Title;
mp3.Tag.Year = 2011;
mp3.Tag.Artists = artists.ToArray();
mp3.Tag.Performers = artists.ToArray();
mp3.Tag.AlbumArtists = artists.ToArray();
mp3.Tag.Genres = genre.ToArray();
mp3.Tag.Composers = composers.ToArray();
mp3.Tag.Track = (uint)albumSong.TrackNumber;

IPicture[] picFrames = { albumCoverPictFrame };
mp3.Tag.Pictures = picFrames;
mp3.Save();
For archiving the final album we used a .NET library called DotNetZip (http://dotnetzip.codeplex.com/), which was also really easy to use. Once an album's tracks were ready for archiving, we would run the following code:
using System.IO;
using Ionic.Zip;
using Ionic.Zlib;

using (var zip = new ZipFile())
{
    var filenames = Directory.GetFiles(@"c:\temp");
    foreach (var filename in filenames)
    {
        // Encrypt each track so the zip can safely sit on a public CDN
        var zipEntry = zip.AddFile(filename);
        zipEntry.Encryption = EncryptionAlgorithm.WinZipAes256;
        zipEntry.Password = "random password";
    }

    // The mp3 data is already compressed, so skip zip compression entirely
    AlbumFilePath = @"c:\final\album.zip";
    zip.CompressionLevel = CompressionLevel.None;
    zip.Save(AlbumFilePath);

    // AlbumFilePath, cdn and ConfigSettings come from the surrounding service code
    CdnLocation = cdn.RelativePath + "\\" + ConfigSettings.ArchiveFileName;
}
DotNetZip let us encrypt the final zip with anything from PkzipWeak up to WinZipAes256, and the level of compression is equally configurable.
The custom album generator service needed to be able to run in parallel so that we could scale up the number of albums processed. We created a queuing system that allowed multiple processes to work off a queue of unprocessed requests. The system needed to be robust enough that if any of the processing services failed, the exception was logged and the item was put back into the queue, ready to be picked up again.
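Conceptually, each worker followed a dequeue/process/requeue loop like the minimal sketch below; AlbumRequestQueue, AlbumRequest and Logger are illustrative names rather than our real types.

using System;
using System.Threading;

// Minimal sketch of a queue worker: process one request at a time,
// log failures and return failed items to the queue for another attempt.
public class AlbumWorker
{
    private readonly AlbumRequestQueue _queue;
    private readonly Logger _log;
    private volatile bool _stopRequested;

    public AlbumWorker(AlbumRequestQueue queue, Logger log)
    {
        _queue = queue;
        _log = log;
    }

    public void Run()
    {
        while (!_stopRequested)
        {
            AlbumRequest request = _queue.Dequeue();  // null when the queue is empty
            if (request == null)
            {
                Thread.Sleep(1000);                   // back off before polling again
                continue;
            }

            try
            {
                GenerateAlbum(request);               // copy, tag, zip, publish
            }
            catch (Exception ex)
            {
                _log.Error("Album generation failed", ex);
                _queue.Requeue(request);              // another worker can pick it up
            }
        }
    }

    public void Stop() { _stopRequested = true; }

    private void GenerateAlbum(AlbumRequest request) { /* see the pipeline below */ }
}

Because several of these workers can run against the same queue, adding capacity is simply a matter of starting more instances of the service.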
The main site was written in MVC 3.0 with Entity Framework as the ORM, and we reused the same ORM for the Windows service's data access layer.
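Sharing the data layer meant the site and the service worked off the same entity model, so queries and mappings were written once. A rough illustration, sketched here in Entity Framework's DbContext style with made-up entity names:

using System.Data.Entity;  // Entity Framework (DbContext API)

// Illustrative entity and context; the real model was richer, but both
// the MVC site and the Windows service referenced the same data assembly.
public class AlbumRequest
{
    public int Id { get; set; }
    public string CustomerEmail { get; set; }
    public string Status { get; set; }  // e.g. "Queued", "Processing", "Complete"
}

public class AlbumGeneratorContext : DbContext
{
    public DbSet<AlbumRequest> AlbumRequests { get; set; }
}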
With the Windows service up and running, the obvious question was: how many albums could it process? Generating an album ready for download involves the following steps (sketched in code after the list):
- Copy selected mp3s to a temp folder
- Change the ID3 tags including embedding an image
- Archive the files into a zip (with no compression – just to make a single file)
- Move the .zip to a suitable location
- Clean up temporary files
- Finish transaction
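Stitched together, the pipeline looks roughly like the sketch below. TagSongs, ArchiveAlbum and CopyToRepository are illustrative stand-ins for the tagging and zipping code shown earlier, and the AlbumRequest shape is assumed; step 6, completing the queue transaction, belongs to the calling worker.

using System.IO;

// Rough sketch of the generation pipeline, one album per request.
public string GenerateAlbum(AlbumRequest request)
{
    var tempDir = Path.Combine(Path.GetTempPath(), request.Id.ToString());
    Directory.CreateDirectory(tempDir);
    try
    {
        // 1. Copy the selected mp3s into a temp folder
        foreach (var song in request.Songs)
            File.Copy(song.SourcePath, Path.Combine(tempDir, song.FileName));

        // 2. Write the ID3 tags, including the embedded artwork
        TagSongs(tempDir, request);

        // 3. Zip the tracks with no compression, purely to get a single file
        var zipPath = ArchiveAlbum(tempDir, request);

        // 4. Move the zip to the repository server that feeds the CDN
        return CopyToRepository(zipPath);
    }
    finally
    {
        // 5. Clean up the temporary files whether or not we succeeded
        Directory.Delete(tempDir, true);
    }
}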
Running on a desktop, the whole process took between 6 and 11 seconds per album, with the processor running at nearly 100%. At that rate, a day's worth of processing would generate around 10,000 albums.
That also did not take into account where the final album would be hosted. We wanted a Content Delivery Network (CDN) to serve the content, so there would be additional time to copy the files to the CDN.
As we had no idea how many albums we would need to generate per day (this hadn't been done before, so there were no stats to draw on), we went for a figure of 60,000. In hindsight this was very optimistic, but far better to over-deliver.
We eventually went for a configuration of six processing servers, plus a repository server to host the albums, which was connected to a CDN. With all six servers running at maximum capacity, we were processing an album every 3 seconds.
Key learning
The Windows service has been running very smoothly in production. In fact, the six servers were overkill: the average time from a user completing a purchase to being able to download their album was 4 seconds. Because we hadn't known how long the process would take, instead of letting the user wait on the fan page for a download link to appear when ready, they had to wait for an email confirming the album was ready and then click a link. In hindsight this could have been avoided to create a more seamless user experience. That said, the service was a great success and secured a huge amount of press coverage.
To the cloud!
At the time of development we were not 100% sure of the architecture of the final solution, so we experimented with a cloud computing service. Universal decided to go with a company called Venetrix to host their servers, so this route was scrapped, but we were very impressed with how easy the cloud was to set up, configure and use, and we would love to deploy a project in this manner in the future.
One of the benefits of the cloud would be the ease of spinning processing servers up and down as the load of requests fluctuated. So we did some testing with Rackspace – http://www.rackspace.com/cloud/
After setting up an account, we created a cloud server and installed the Windows service. There was a variety of configurations, all varying in price. Rackspace also had an API that let you easily copy files to their CDN, powered by Akamai.
A 2-core server cost $0.16 an hour and generated an album, copied to the CDN, every 30 seconds, so each server could churn out around 3,000 albums a day; 20 servers would equate to $77 a day. 60,000 albums would equal 4.6TB at 80MB per album! Storage costs with Akamai were:
- 11p per GB/month to store the albums
- 12p per GB uploaded
So storing the 60,000 albums generated in a month would cost £518, with another £565 for delivery (assuming each album is only downloaded once). This works out at roughly 2p per album. These are all rough estimates, but they give some ballpark values. It could have been a good solution!
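Pushing a finished album into Cloud Files is essentially one authenticated HTTP PUT against the storage API. A minimal sketch, assuming the storage URL and auth token have come from Rackspace's authentication call (omitted here) and using "albums" as an illustrative container name:

using System.IO;
using System.Net;

// Minimal sketch: upload a zip to a Cloud Files container with an HTTP PUT.
public static void UploadToCloudFiles(string storageUrl, string authToken, string zipPath)
{
    var objectUrl = storageUrl + "/albums/" + Path.GetFileName(zipPath);
    var request = (HttpWebRequest)WebRequest.Create(objectUrl);
    request.Method = "PUT";
    request.Headers.Add("X-Auth-Token", authToken);
    request.ContentType = "application/zip";
    request.AllowWriteStreamBuffering = false;  // stream rather than buffer an 80MB album

    using (var file = File.OpenRead(zipPath))
    {
        request.ContentLength = file.Length;
        using (var requestStream = request.GetRequestStream())
        {
            file.CopyTo(requestStream);
        }
    }

    // A "201 Created" response means the object is stored and ready to serve
    using (var response = (HttpWebResponse)request.GetResponse()) { }
}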
Summary
Once the site went live, the solution handled the load exceptionally well. It was very easy to add or remove production servers as the load increased or decreased, and requests sat in the queue for no more than a few seconds before being processed. It was fast enough that we could have delivered the customer's album in real time rather than notifying them by email when it was ready. And finally, the services were very robust. Overall, a big success!