
AV Foundation: Saving a Sequence of Raw RGB Frames to a Movie

An application may generate a sequence of images that are intended to be viewed as a movie outside of that application. These images might come from, say, a software 3D renderer or a procedural texture generator. In a typical OS X application, such images may be in the form of a CGImage or NSImage, and there are a variety of approaches for dumping such objects to a movie. In some cases, however, the image is stored simply as an array of RGB (or ARGB) values. This post discusses how to create a movie from a sequence of such “raw” (A)RGB data.


NSSavePanel: Adding an Accessory View

Cocoa’s NSSavePanel allows one to programmatically add essentially arbitrary interface elements and functionality, in the form of an accessory view. In this post, I show a very simple accessory view example: allowing the user to control the file type (that is, the suffix) of the file to be saved. I’ll present this in two contexts: first, in purely Objective-C usage; and second, using an NSSavePanel inside a C/C++ function. In the latter case, I show an example of using a selector in a separate object to handle “callbacks”. This post is aimed at novice Cocoa programmers; experienced programmers looking to add a file type selection are encouraged to check out JFImageSavePanel or JAMultiTypeSavePanelController. Apple’s Customizing NSSavePanel shows other uses for the accessory view.


OS X: Launching Another Application Programmatically

Occasionally an application may require that another application be run. This other application may be some behind-the-scenes “helper” or auxiliary app, or it may be necessary for the user as part of a larger workflow. In this post, we go over some techniques for launching an application programmatically. In the process, I’ll go over a general method for passing parameters to a bundled AppleScript.

An Xcode project for this is available on github.


iOS Bézier Clock


I recently stumbled on Jack Frigaard’s Bézier Clock web page, which demonstrates his use of Processing.js to show an animated “digital” clock. He links to another page containing his JavaScript code.

[Update: Jack’s original web pages are MIA, but can be found via the Internet Archive Wayback Machine here and here.]

I thought it would be fun to see if I could translate this into an iOS app; this project is the result of that effort. In truth, this is more of a transliteration than a proper translation: I converted it to Objective-C by creating equivalents to Jack’s classes, adding some UIViewControllers and UIViews, and pasting his code in. My goal was to keep his code and algorithms as intact as possible while writing fairly “proper” Objective-C, so the resulting code is probably not quite what one would write if starting from scratch on iOS.

This new “Apple SIM” could legitimately disrupt the wireless industry

Perhaps the most interesting news about Apple’s new iPad Air 2 tablet is buried at the bottom of one of its marketing pages: It will come pre-installed with a new “Apple SIM” card instead of one from a specific mobile operator.

If anything can disrupt the carrier marketplace, this is it.

It’s early, but it’s easy to see how this concept could significantly disrupt the mobile industry if Apple brings it to the iPhone. In many markets—especially the US—most mobile phones are distributed by operators and locked to those networks under multi-year contracts. People rarely switch operators, partially out of habit and satisfaction, but mostly because it’s annoying to do so.

Carrier lock-in is the bane of market competition. The ability to easily switch carriers and preserve your cellular identity, keeping not only your device but your existing number, without anything more than that device itself—this is the holy grail for carrier customers.

Via @theloop.

Excellent Sheep

I recently had the opportunity to attend a talk by William Deresiewicz, author of the recently published Excellent Sheep: The Miseducation of the American Elite and the Way to a Meaningful Life. This piece was prompted by his comments on the direction and decisions of today’s young people as they approach their majority and contemplate higher education.

If excellence is the goal, “Education” is the obstacle.

To the capable and the driven, who today account for a great number of the sons and daughters of Middle Class America, an elite education is the stairway to success. It is, in fact, the essence of the Middle Class Dream. While the American Dream is the broad belief that success can be achieved through hard work and perseverance, the Middle Class Dream applies that belief to the following premise: if the most successful people come from elite universities, then the best path to success is through those universities. It is the belief that an elite education is the road down which hard work travels to become success. Unfortunately, the numbers back this belief: according to the U.S. Department of Education’s National Center for Education Statistics, higher education correlates strongly with higher income. The Center writes, “In 2012, young adults with a bachelor’s degree earned more than twice as much as those without a high school credential…”

A pragmatic parent’s response is obvious: send your kids to college to insure their future. Send them to the best college they can get into, because the bigger the name on their résumé, the better their prospects are in the job market. College is an investment—insurance against economic uncertainty—and, as in finance, the bigger the initial investment, the larger the return. The goal is no longer learning; the goal is achievement and the appearance of educational excellence. Education has become a system and, like any system, the most successful are those within it who play the game to their advantage.

As Deresiewicz writes in his essay “Ivy League Schools Are Overrated”, elite education is structurally opposed to intellectual curiosity: “Students are regarded by the institution as ‘customers,’ people to be pandered to instead of challenged. Professors are rewarded for research, so they want to spend as little time on their classes as they can. The profession’s whole incentive structure is biased against teaching, and the more prestigious the school, the stronger the bias is likely to be. The result is higher marks for shoddier work.” Students avoid risk like the plague, meeting the requirements of their courses but without the freedom to explore the content or develop a true understanding of it. Deresiewicz continues, “Once, a student at Pomona told me that she’d love to have a chance to think about the things she’s studying, only she doesn’t have the time.”

This extends down into the admissions process and from there into high schools. Getting into an elite college is an act of perseverance: forego free time, abandon excellence, and throw yourself bodily into the process of gaming the higher-education system and filling every box on the admissions offices’ checklists.
I know this because some of my friends did it: they played the education system and were admitted to Harvard, MIT, Stanford, Princeton, and Pomona (among others; whether or not they enrolled there is another story).

Society believes that education produces excellent individuals. Excellent individuals are capable, principled, and interesting. Elite education, then, does not produce excellent individuals. It produces, to use Deresiewicz’s book title, “Excellent Sheep.” As he writes in the book, “Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.” People like this would only ever be called “interesting” by a psychologist studying unbalanced personalities.

What, then, does it mean to be interesting? We talk about well-educated people being interesting and engaging. They say and do interesting things. One of my father’s college professors once told him, “The most interesting people know the most interesting things.” Of course the most interesting people have the most interesting knowledge. But how do they get it? How do they become interesting? How do they achieve this most sought-after personality trait? Was it something they read? Was it something they heard? Was it the college they went to? People search after these answers, but never find them. I think it’s because they can’t be found. There is no book you can read to become interesting, no lecture you can attend. There is no college that can bestow it upon you at graduation, for no school can collect books that don’t exist and offer lectures that can never be heard.

If there is one thing I have learned in my meager time on this earth, it is that interesting is not a status you can achieve. Interesting is not a badge you can earn, a box you can check off, an award you can receive, nor a rank you can attain. Interesting is a state of being. It is a process, continuous; it is the very act of being interested. “The most interesting people know the most interesting things,” not because they learn them to become “interesting”, but because they study that which they find interesting. Being interesting is being intellectually curious, exactly what our elite universities prevent. And conveniently, the most interesting people (by this definition) are also some of the most successful in life. They are excellent individuals, not just excellent sheep.


If you’re on Medium, this story was also published there.

iCloud Photo Library

Apple today announced iCloud Photo Library, allowing users to maintain a single photo library across all their devices with photos stored in their original format and resolution on iCloud:

Every photo, every edit, every album now lives in your iCloud Photo Library, easily viewable and consistent on all your iOS devices. Automatically.

Looks like Gruber and his sources were right on the money about iCloud photo services seeing a major improvement in 2014.

It also looks like Peter Nixey has been vindicated, finally:

1. I want the canonical copy of my iPhoto library in the cloud. One iPhoto library in the cloud, many devices with access to it. I want to edit, organise and delete photos on any device and see the same changes on all other devices. No master/slave setup – just straight cloud access.

According to Craig Federighi, this is how it works. Unfortunately, it doesn’t seem like Apple read this next part:

2. You can charge me for this. I suggest $5/month. Maybe that’s a bit more than it costs you at the moment but that’s what I’m prepared to pay and we both know that you’ll do very well out of this in the long run. However for that I want unlimited space including for all of my videos. FYI that’s not really what I’m paying you for. I’m really paying you for the peace of mind that you’ve got my memories safe-guarded. I’m technically paying you for insurance. The utility this offers is just the carrot that gets me over the hump of paying you.

Pricing matches iCloud pricing, and iCloud Photo Library takes up space in iCloud:

  • Free—Up to 5GB
  • $0.99/mo—Additional 20GB
  • $3.99/mo—Additional 200GB
  • Tiers up to 1TB

Personally, I think this whole pricing structure is a mistake. There are too many levels; it’s too complicated. Nobody should have to worry about keeping track of their data size, with the attendant concerns of exceeding their storage cap. As Backblaze has shown, robust data storage is cheap enough to offer at a single, unlimited price point. This is what Apple needs to do.

Compared to the overall price of the device and of service contracts for cellular coverage and data (for iPhones and cellular iPads), $5/mo is affordable. Apple doesn’t even have to operate at cost for this service: Backblaze has shown it can be done for desktop customers, who doubtless have many times the storage requirements of iCloud users.

Apple: if you’re listening, do what Peter Nixey said and give iCloud Drive storage two tiers: one free, up to something small like 5GB, and one paid, with unlimited data for $5/mo.

Mac App Store: The Subtle Exodus

The Mac App Store can be better than this. It should be better than this.

Let me make it absolutely clear why I’m writing this. First and foremost, it’s because I deeply care about the Mac platform and its future; it pains me to see developers abandoning it. The Mac App Store can be so much better: it can sustain businesses and foster an ecosystem that values and rewards innovation and high quality software. But if you talk to developers behind the scenes or explore the Mac App Store, you’ll find something completely different.

Via Daring Fireball.

FFmpeg: convert RGB(A) to YUV

I recently had a need to convert a series of rendered images generated in an application to a movie file. The images are rasters of raw 32-bit RGBA values. A typical solution to this problem would be to dump the images to disk and then use a command-line program such as ffmpeg to convert them to the desired movie format (in this case, MPEG-2).

In my particular usage scenario, this simple solution was not an option for various reasons (nearly unbounded disk usage, user interface issues, etc.). Another option is to use the FFmpeg API to encode each frame’s raw RGBA data and dump it to a movie file. Unfortunately, I was unable to find an encoder that would directly accept the raw RGBA data for the desired movie format.

A web search turned up a potential solution: http://stackoverflow.com/questions/16667687/how-to-convert-rgb-from-yuv420p-for-ffmpeg-encoder

It turns out you can convert RGB or RGBA data into YUV using FFmpeg itself (via its libswscale library), and the converted frames can then be encoded and written to a file. The basics are just a few lines: first, create an SwsContext that specifies the image size and the source and destination pixel formats:

#include <libavcodec/avcodec.h>
#include <libavutil/frame.h>
#include <libswscale/swscale.h>

AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG2VIDEO);
AVCodecContext *c = avcodec_alloc_context3(codec);
// ...set up c's params (width, height, time_base, pix_fmt, etc.)
AVFrame *frame = av_frame_alloc();
// ...set up frame's params and allocate its image buffer
struct SwsContext *ctx = sws_getContext(c->width, c->height,
                                        AV_PIX_FMT_RGBA,     // source: packed RGBA
                                        c->width, c->height,
                                        AV_PIX_FMT_YUV420P,  // destination: planar YUV 4:2:0
                                        0,                   // scaling flags (sizes match, no scaling)
                                        NULL, NULL, NULL);   // src/dst filters, params

And then apply the conversion to each RGBA frame (the rgba32Data pointer) as it’s generated:

uint8_t *inData[1]     = { rgba32Data };    // packed RGBA is a single plane
int      inLinesize[1] = { 4 * c->width };  // 4 bytes per RGBA pixel
sws_scale(ctx, inData, inLinesize, 0, c->height,
          frame->data, frame->linesize);

One important point to note: if your input data has padding at the end of its rows, be sure to set inLinesize[0] to the actual number of bytes per row, not simply 4 times the width of the image.

If you’re familiar with the FFmpeg API, this info should be sufficient to get you going. The FFmpeg API is quite extensive and a bit arcane, and even something as functionally simple as dumping animation frames to a movie file is not completely trivial. Fortunately, the FFmpeg folks have provided some nice example files, including one that demonstrates some basic audio and video encoding and decoding: https://www.ffmpeg.org/doxygen/2.1/decoding__encoding_8c.html

I took the source for the video encoding function and hacked it up to incorporate the required RGBA-to-YUV conversion. The code performs all the steps needed to set up and use the FFmpeg API, start to finish, to convert a sequence of raw RGBA data to a movie file. As with the original version of the code, it synthesizes each frame’s data (an animated ramp image) and dumps it to a file. It should be easy to change the code to use real image data generated in your application. I’ve made this available on GitHub at:

https://github.com/codefromabove/FFmpegRGBAToYUV

For Mac programmers, I’ve included an Xcode 6 project that creates a single-button Cocoa app. The non-app code is separated out cleanly, so it should be easy for Linux or Windows users to make use of it.

Other input formats

The third argument to sws_getContext describes the format and packing of your source data. FFmpeg defines a huge number of pixel formats (see pixfmt.h), so if your raw data is not RGBA, chances are there is already a matching format and you shouldn’t have to change how your images are generated. Be sure to compute the correct line width (inLinesize in the code snippets) when you change the input format specification. I don’t know which input formats sws_scale supports (all, most, just a few?), so it would be wise to do a little experimentation.

For example, if your data is packed 24-bit RGB, and not 32-bit RGBA, then the code would look like this:

struct SwsContext *ctx = sws_getContext(c->width, c->height,
                                        AV_PIX_FMT_RGB24,    // source: packed 24-bit RGB
                                        c->width, c->height,
                                        AV_PIX_FMT_YUV420P,
                                        0, NULL, NULL, NULL);
uint8_t *inData[1]     = { rgb24Data };
int      inLinesize[1] = { 3 * c->width };  // 3 bytes per RGB pixel
sws_scale(ctx, inData, inLinesize, 0, c->height,
          frame->data, frame->linesize);