BrewDog brewery raising $50M from the crowd to secure U.S. expansion

With plans for a new brewery and aggressive expansion in the biggest craft beer market in the world, Scottish brewery BrewDog* turned to BankRoll to raise investment to, er, bankroll its U.S. expansion plans, with a hefty $50 million round of equity crowdfunding.

The brewery has secured 42 acres of land in Winchester, Ohio (don’t worry, I had to look it up, too), where it is building a 100,000-square-foot brewery that is to become the epicenter of its U.S. operations, with a taproom, a restaurant, a visitor center and — of course — an industrial-sized brewery to start filling bars and bottle shops around the land.

To fund the expansion, BrewDog is launching its equity crowdfunding campaign, for the first time inviting U.S. investors to join its existing 40,000 European investors.

This isn’t the company’s first foray into tapping its drinkers for cash; since being founded in 2007, the brewery has leveraged its against-the-grain branding in every aspect of its business, including how it raises money. Over the past decade, the company has run four “Equity for Punks” crowdfunding campaigns in Europe. The most recent campaign closed in April, raising £19 million ($25.3 million) at a valuation of $400 million, among some murmurings that it might be tricky to get a return on that particular investment.

Drunk on controversy

Tasty? Who knows, I’ve never tried it. Tasteful? Well…

The brewery has been operating in the U.K. for a number of years, and put itself on the map with a series of nutty PR stunts, including launching “Never Mind the Anabolics,” a beer that would have gotten athletes in trouble had they drunk it; a $600 bottle of beer sporting 55 percent alcohol served inside a stuffed squirrel; and generally making a high-profile nuisance of itself whenever it steps on the toes of regulators, which, as you might have already guessed, it does often.

With great success, I hasten to add; the BrewDog brewery and its eponymous brewpubs are popular and, according to its own reports, very profitable indeed. It describes itself as “Europe’s largest craft brewer,” raising some eyebrows among beer lovers, who wonder how big a brewery can get before it’s no longer making craft beer.

Taking on the U.S. crowd-equity world

Beer is fun, and BrewDog’s IBU-laden beers are sure to mesh nicely with the IPA-loving American market — but what’s really interesting to me is the mechanism for raising money. There is no shortage of investors in the food and drinks market (and there have been some huge acquisitions in this space recently) — so turning to the crowd for equity investing is a strong move; it is certainly injecting some of the spirit we’ve been seeing in European crowd-equity investing into the more tightly regulated (and, frankly, more uptight) U.S. market.

I hope for BrewDog’s sake, however, that it’s able to stay on the right side of the (far more stringent) investment laws in the U.S. Brewing another 0.5 percent alcohol “Nanny State” beer to “apologize” for the 32 percent alcohol Tactical Nuclear Penguin beer isn’t going to cut it as an apology on this side of the pond.

In any case, the company’s pitch for cash is below, and if you’re a sophisticated investor interested in this space, this would be a good place to start your research.

[embedded content]

* Disclaimer: I participated in an early equity crowdfunding round, making me a shareholder in BrewDog.

Source: TechCrunch

Threesome app 3nder renames to Feeld after Tinder lawsuit


The app that brings two hearts plus one
together with a mobile app,
Its name was one but now is gone
with a mighty legal slap.

Too close for Tinder’s rights of copy
its name was fought but then repealed.
Changing after getting stroppy
from Thrinder to the saucy ‘Feeld.’

Already on the phones from Apple
the app now, too, on And-er-oid.
Ready for some sweaty grapple
worthy of the dreams of Freud.

Source: TechCrunch

Staaker drone follows you as you attempt to break your neck

The $1,200 Staaker drone does away with the traditional remote control, replacing it with a wearable armband. Instead of controlling the drone like a remote-controlled plane, you put the Staaker into one of several follow modes and it will keep you in view as you careen down a mountainside or flip upside-down on a surfboard. I had a chance to try it out and was mightily impressed.

Folded up into its protective pouch, Staaker is small and light enough to throw in a backpack.

The new drone folds into a relatively small package, perfect for bringing into the mountains for filming extreme sports. When it’s unfolded, folding down the landing gear cleverly locks the arms into place, and the drone is ready to fly.

The drone can be flown like a normal drone, using simple up/down, closer/farther and turn left/turn right controls. That’s useful for establishing shots and B-roll, but the drone is pretty clumsy to fly in this mode. Where it really shines is in its follow modes: Set the mode once, and the drone keeps following you around while you pay attention to your extreme sport activities.

Staaker’s drone uses a 3-axis gimbal and a GoPro for the actual filming. That feels like a step back from some of the live-view drones you see these days, until you remember that there’s no need to monitor the footage as you’re shooting. Put differently: If you have time to look at what the drone is doing, you’re probably not skiing/driving/skateboarding/falling on your face hard enough.

The drone that’ll stalk you

Staaker is controlled from a sturdy arm-mounted remote control/locator. It looks pretty big, but that makes sense: It’s designed to be possible to operate wearing a wetsuit or snowboard gloves.

Both the armband remote and the drone have a series of sensors built in, designed to keep the drone pointed at you every step of the way.

The drone has three main modes: There’s an auto-follow mode, where it keeps its position stable relative to the wearer of the armband (i.e. behind, ahead, above or to the side of the athlete); there’s a circle mode, where it circles in slow (or fast) circles around the armband wearer; and there’s a stationary mode, where it hovers in place, rotating to keep the athlete in view.
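
The three modes boil down to simple geometry. As a purely illustrative sketch (this is not Staaker's actual control code; the function, the offsets and all the parameters are assumptions of mine), here is how a follow drone might compute its target position from the armband's reported position and heading:

```python
import math

def target_position(mode, athlete_xy, heading_rad, t=0.0,
                    offset=(-5.0, 0.0), radius=8.0, period=12.0,
                    home=(0.0, 0.0)):
    """Return the (x, y) point, in meters, the drone should fly toward.

    mode: 'follow'  - hold a fixed offset relative to the athlete's heading
          'circle'  - orbit the athlete once every `period` seconds
          'station' - hover at `home`; the drone only rotates to track
    """
    ax, ay = athlete_xy
    if mode == "follow":
        # Rotate the body-frame offset (behind/ahead/beside) into world frame.
        ox, oy = offset
        c, s = math.cos(heading_rad), math.sin(heading_rad)
        return (ax + c * ox - s * oy, ay + s * ox + c * oy)
    if mode == "circle":
        angle = 2 * math.pi * t / period
        return (ax + radius * math.cos(angle), ay + radius * math.sin(angle))
    if mode == "station":
        return home
    raise ValueError(f"unknown mode: {mode}")
```

In "follow" mode, a body-frame offset of (-5, 0) means "five meters behind the athlete," rotated into world coordinates by the athlete's heading; a real drone would feed the resulting target into its position controller many times per second.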

Is it a plane? Is it a bird? No, of course not, don’t be bloody daft. It does sound an awful lot like a drone, though. What could it be?

The most interesting thing, to my mind, is that most drones are there to film others. Staaker is different and, in the process, is turning the industry on its head. By creating a drone you can use to film yourself, it is broadening the target audience from eager filmmakers to everyone involved in sports and other physical activities worth filming. It’s hard to say whether this subset of people is in the market for selfie-drones on steroids, but judging from some of the footage I’ve seen Staaker’s drones create, I can easily see this being huge. I can’t wait to see what happens when this gets in the hands of extreme sports athletes.

The one downside of Staaker’s drone is that it doesn’t currently have obstacle avoidance, which may turn out to be a problem for some users. It certainly turned into a problem when we were filming TechCrunch’s video report on Staaker; as we were shooting some additional footage at the end, a seagull crashed into the drone (or vice versa?), disabling it and sending it crashing onto the highway. The lesson here, perhaps, is to keep the drone away from wildlife and populated areas.

From extreme sports to the world

Hey, don’t turn around now, but I think we’re being followed.

Based in Norway, the company claims its location is perfect for creating this type of startup.

“We have access to a lot of smart engineers,” says Ole Jørgen Seeland, the company’s CEO, “but far more importantly, Norway is a natural destination for extreme sports, which means we have a huge amount of opportunities for testing. We have destroyed a lot of drones along the way, but that’s OK: These machines are built for use in extreme situations. Every damaged drone helped us refine the product to make it stronger — and to help us gauge which spare parts we need to stock.”

Having the advantage of only ever having to film forward helped determine camera position and enabled the team to put together a boldly designed drone.

“We built 50 prototypes to get to where we are today,” says Seeland, explaining that the company has put its drones through hell to ensure they work as intended. “We had 20 different iterations on the design along the way.”

On the topic of Norwegian: The founders of Staaker can’t quite make up their minds as to how the company’s name is pronounced; I’ve heard both “stacker” and “stalker.” The latter might make the most sense for native speakers: “Staaker,” pronounced in Norwegian, sounds exactly like the English word “stalker.” Oh, multilingual puns, how we love thee.

Pre-orders starting today

Lily — Staaker’s biggest competitor — appears to be under investigation by Indiegogo’s Trust & Safety team. So that’s reassuring.

Starting today, the drone can be pre-ordered, a prospect that will scare off a few of those who’ve backed drone startups in the past.

ZANO’s extremely public failure on Kickstarter prompted the crowdfunding platform to pay an independent journalist to investigate what had happened, and the Lily drone (probably the biggest competitor from Staaker’s point of view), which was funded on Indiegogo to the tune of $34 million, is currently “under review by Indiegogo’s Trust & Safety team.”

“We know we can deliver,” says Seeland, CEO and founder at Staaker, revealing that the company is aware of some other failures in its market. “Our drone will be manufactured by Foxconn. We figured if it’s good enough for Apple, it’s good enough for us, too.”

I’ve seen a lot of prototypes in my time, and I’d be inclined to give Seeland and the Staaker team the benefit of the doubt; the prototypes we flew for TechCrunch’s video were of significantly higher quality than the ones I usually see as a company starts its pre-order campaign. Of course, that doesn’t necessarily mean the company will be able to adhere to its claimed December shipping date, but solid prototypes and Foxconn manufacturing go a long way toward assuaging the knee-jerk reaction I have when I hear “drone pre-order campaign.”

The drone is available for pre-order from the Staaker website for $1,200 starting today. Once pre-orders end, the company says, the price will increase to around $1,800.

Want to see some more footage of it in action? Of course you do — the company’s launch video is embedded below.

[embedded content]

Source: TechCrunch

BBC: And then we strapped a helicopter rig to an elephant

Imagine being part of the BBC’s natural history unit, traveling the world to create some of the world’s most beautiful documentaries. Sounds like a dream job, right? I sat down with Huw Cordey, the producer on a ton of the Beeb’s best-loved shows, to find out more about the technology and gadgets the team deploys to capture the beasties in action.

“I’m sorry, sir, your hand luggage must fit into this gauge.”

“People are always developing new equipment,” Cordey says, half grinning, half sighing, as I imagine him waving at an enormous pile of Peli cases stacked in the corner of his no doubt stacked-to-the-rafters office, “which is perfect for us. If you think about it, wildlife hasn’t really changed what they have been doing since we started filming nature documentaries. Instead, we have to come up with new ways of telling their stories.”

Coming up with innovative things is kind of his thing. For the most recent show Cordey worked on — BBC’s The Hunt, which is currently showing on a BBC channel near you — the concept was to try to capture how animals hunt in the wild. Which isn’t that tricky if you’re trying to film a spider rolling up a fly into a little ball, perhaps, but it’s a very different kettle of orcas when you’re talking about large animals careening along the savannah at 55 mph.

Is that a Cineflex on the side of your Land Cruiser, or are you just happy to see me?

“My favorite piece of equipment we used on this series was the Cineflex gyroscopic mount,” Cordey says, referring to the super-stabilized, basketball-sized, 85-pound mount usually attached to helicopters.

Yes, helicopters. If that sounds a little bit James Bond, and you’re getting all sorts of action sequence emotions tingling your spidey senses, check out the video below. It’s pretty epic.

[embedded content]

Of course, the team did use the Cineflex mounts as intended — strapped to the side of a helicopter — but they also used the camera extensively in other situations.

Well, polar bears don’t film themselves, you know!

“We were able to mount the camera on a truck and follow the animals as they were hunting,” says Cordey. He explains that the team did have to be a bit more careful than usual; altogether, the gear they tied to the truck costs around half a million dollars.

All the hours playing on the PlayStation finally paid off.

— Jamie McPherson

He adds, soberly: “It was a completely new way of following wildlife; all the bounces were taken out of the footage by the technology, creating a smooth shot from beginning to end. We can follow the chase as fast as we are able to drive, but if we’d hit a porcupine burrow along the way, we’d have been buggered.”

Filming in this manner also meant that the team was able to film a full hunt “for real,” rather than doing it the way it had been done in the past. Which… used to work out pretty much exactly the way you’d imagine. In some nature documentaries, you might notice that the animal that is caught in the end looks slightly different or is a bit bigger or smaller than the animal at the start of the sequence. Of course, that is because they filmed different hunts, often on completely different days and in different locations.

Because why wouldn’t you attach a stabilization mount designed for helicopters to the biggest land animal in the world?

One innovation was to build a mount enabling the team to film… from an elephant. Which is every bit as bonkers as it sounds, but it isn’t completely “just because we can” — it turns out to be the perfect way to film tigers because, unlike drones, helicopters and juicy cameramen, tigers basically ignore elephants.

[embedded content]

In the above video, the team explains how they came up with the idea of creating an elephant-mounted camera. 

“Being able to film continuously through the hunt is a complete game-changer for us,” Cordey admits, with more than just a little bit of awe in his voice. This, I can tell, is bloody exciting, even for the 20-year veteran of nature documentaries.

“We had an amazing cameraman, Jamie McPherson. He’s really taken to all of the tech, and I had to laugh one night after a shoot, when he pointed out that all the hours playing on his PlayStation finally paid off,” Cordey says, but looking into the tech more closely makes it clear that he’s only partially joking. Operating a Cineflex is an awful lot like playing a video game. Instead of throwing a camera on your shoulder, you have a screen and a remote control, meaning a slightly different skill set is needed to operate the new kit.

An awful lot like playing PlayStation… if your PlayStation cost half a million dollars.

Bring on the drones

New technology is a key aspect of the series, enabling whole new aspects of storytelling.

Elephants run a mile immediately when a drone comes near.

— Huw Cordey

“I love drones; we use them all the time,” Cordey explains, pointing out that the Panasonic GH4 is a particular favorite. It weighs very little, but the quality is good enough for broadcast use, which means it is perfect for sending up with a drone.

“We do have to be very careful, though; drones sound like a swarm of angry bees, which scares some animals, so we can’t get too close,” Cordey says, explaining that the team uses drones especially for overview and landscape shots. “Elephants run a mile immediately when a drone comes near.”

Despite their shortcomings, drones are a favorite, and the team often carries several, launching them whenever they’re needed rather than getting out a jib arm or calling in a helicopter.

Consumer-grade equipment, broadcast quality footage

The other big change over the last 20 years is how good prosumer equipment has become. In the past, you needed specialized equipment in order to shoot in high def for television. The team does obviously still use high-end equipment a lot of the time, but commercially available equipment makes it much easier and cheaper to shoot shows today.

“For The Hunt, we used Camblock and Kessler sliders a lot of the time. For the time-lapse sequences we often shoot with Canon 5D MK III cameras and for low-light, Sony’s α7S has fantastic abilities at night,” says Cordey. “We also use the tiny Flare cameras a lot; they were invaluable for shooting time-lapse sequences of army ants.”

[embedded content]

If you haven’t seen what Sony’s α7S can do in low light, check out the above short film, shot exclusively by moonlight. If that’s not impressive to you, perhaps it’s time to go for a walk; you’ve been on the internet for too long.

“Ultimately, you can’t say that a $2,000 camera delivers the same quality as a $50,000 camera, but it’s horses for courses. If we can use more affordable cameras, that’s fantastic,” Cordey explains.

Another big advantage is the availability of camera traps, much like the ones offered commercially by Camptraptions, though the team has built its own gear, powered by a bank of batteries.

“Being able to leave a camera out for months at a time makes it possible to capture scenes that we wouldn’t have been able to otherwise. For very rare animals like leopards, it’s the best way,” says Cordey. Makes sense; leaving a $50,000 camera and a very expensive cameraman in the forest for a few months gets very expensive very quickly, but it’s worth risking a couple of grand’s worth of equipment on the off-chance that you capture a shot worthy of the series.

Huge leaps in technology

The past 20 years have seen a huge leap in technology, and the natural history unit veteran reflects on what has changed.

You can’t use white light on most animals.

— Huw Cordey

“Twenty years ago, you couldn’t film at night without lights, and helicopters needed to get within 50 meters of the animals,” Cordey explains. That’s a problem: Nocturnal animals shy away from light, and you can imagine what happens if you’re casually grazing along, minding your own business, when suddenly a 10,000-pound inferno of sound, wind and scary starts following you around.

A lot of the best new tech has come from both commercial outfits and military applications.

There have got to be easier ways of taking a selfie.

“Infrared, thermal cameras and low-light tech all came from the military,” Cordey lists. “They have deep pockets and all the time in the world. Where possible, we take what they’ve done and adapt it for ergonomics so we can bring it to where we need it to be to capture the animals.”

Security technologies have come in handy more than once, too.

“You can’t use white light on most animals,” Cordey says. “But filming in IR gives a pretty nice feel. Best of all, a lot of wildlife don’t see the IR light sources, so we can film them as much as we like. The downside is that it’ll all be in black and white. For some sequences, we’re now thinking about using the next-generation Sony A7 for shoots late in the evening or early in the morning.”

Still some ways to go

Cineflex? More like Eleflex.

Not every problem has been fully solved. Camera equipment manufacturers, if you’re paying attention, here’s how to make your next generation of sales.

“Every year, drones get better,” Cordey says, but adds that he does have a wish list for improvements. “With a helicopter, we can fly for a couple of hours. For drones, we have to bring them back within six to seven minutes, which isn’t enough. I also wish they were quieter, so I can bring them closer to the animals, and I’d love better stabilization and zoom lenses.”

Another area where the BBC is butting against the edges of what’s technically possible is filming at night. A lot of nocturnal animals haven’t been properly documented, and I got the very distinct sense that Huw Cordey and his team would love to do a whole series focusing exclusively on the animals that only come out at night.

“Give me a gyro-stabilized long-lens camera on a silent drone and you’ve hit the jackpot,” Cordey laughs.

If you want to see The Hunt in full, tune in to BBC America, Sundays at 9/8c.

Charging startups to apply to an accelerator is exploitative and dumb

Yesterday, a little birdie told me about Nextt, an accelerator (probably more accurately referred to as an incubator) for early-idea-stage startups. The idea is to take a napkin-stage idea and turn it into an MVP of sorts in six weeks. Not a shabby idea, but the insidious thing here is that Nextt charges $100 per idea. Not to participate, mind you — to put in an application.

Ideas are cheap

The Nextt website seems awfully cheerful. Calm down, chap, you’ll wear yourself out.

Starting a startup is hard. It’s daunting. And especially if you haven’t been through the mill a few times, it’s really difficult to know where to even start. Of course, it all begins with the spark of an idea, but if you’ve ever been in a room full of entrepreneurs, you’ll know that any moderately smart person can churn out 50 problems worth solving and 15 viable business ideas before breakfast.

What I’m trying to say is that ideas are cheap. Or rather, they used to be, until Nextt got their paws on them.

Apparently aimed at first-time entrepreneurs and people dipping their toes in entrepreneurial endeavors for the first time, Nextt’s website comes across as a pitch-deck factory.

The outcome of the program, they write, will leave participants with “a sleek highlight reel,” an “opportunity to connect with VCs” and some “testimonials from your project advisors.”

“Our advisor community is made up of designers, data scientists, developers and attorneys,” says the founder of the Nextt accelerator, Ajay Rajani. “These advisors analyze data, write code, design mockups and dedicate many hours to the program in order to provide potential entrepreneurs the opportunity to validate their idea in a cost-effective way.”

“A fair and more accessible way to finance our program”

I don’t want to disparage the program’s deliverables; undoubtedly, all of those things are of great value to a startup. My issue is that the model seems iffy. The startup world is meant to be a place where experienced entrepreneurs take care of the newcomers, not exploit them.

If you’re one of the 87 percent of founders who don’t make the cut, it’s hard to argue you received great value for your money.

“[The $100 application fee] signals that the applicant is serious about their idea [and] makes a commitment to their idea and our expert team and advisor community,” says Rajani. “We see an application fee as a fair and more accessible way to finance our program, and the implicit value of our program is objectively and exponentially more.”

Nextt’s founder tells me that the company received 21 applications for its first invite-only cohort, before accepting four (19 percent) of them. For its second cohort, it received 42 applications and accepted another four (9.5 percent). It is currently trying to recruit an additional 5-15 successful applicants for a September cohort.

But… Nextt can see the idea even if you don’t pay

A troubling aspect of this is that the application form is a three-step process in the shape of a Typeform form. When I first heard of the program, my first port of call was to have a closer look at the sort of questions the accelerator asks its participants. I clicked on the Submit an Idea link in the menu, then the Jump to Form button.

On the first page, I agreed that Nextt won’t sign an NDA. On the second page, I filled in all the details about my idea, the problem it solves and a competitor analysis. On the third page, I was surprised by a payment form.

This is the screen that meets you after you’ve already filled in all the details about your idea.

I agree that perhaps I should have read more carefully before pitching my next drone startup, but it’s worrying that there are several ways through the website (including the one I took) where you won’t see a warning that this is a paid-for application until you hit the payment part of the form. You’ve already agreed that Nextt won’t sign an NDA, and the data from page two — where you share the idea of your startup — is saved. If you decide at that point not to pay, Nextt still has access to your idea.

Nextt keeps “a small stake” in projects that work out, but I couldn’t find anywhere on its website what that means. The most obvious place to go looking — the company’s terms and conditions — doesn’t appear to exist.

If you’re panicking, wondering what is happening to your idea at that point, it is not particularly reassuring to discover that the company’s terms and conditions link doesn’t work; it simply points to the site’s home page.

As I mentioned at the beginning of this article, ideas are cheap, so Nextt having access to a startup’s idea isn’t necessarily a big deal… but that doesn’t matter, because the target audience — very new entrepreneurs — doesn’t know that; they think their idea is worth its weight in printer ink. Which, I think, makes them far more likely than most to reach for their wallets and cough up the $100.

“Typeform by default collects data from all form participants, including those who do not fully complete the application process,” says Rajani. He claims that Nextt “does not review the application in depth” until a fee has been collected.

“Justified by the return on investment if you get into Nextt” — but no word on what happens if you don’t.

On the site, Nextt points out that it chooses to charge an application fee to ensure the founders’ commitment. The company claims it’s cheap and that it’s totally worth it, if you are accepted into the program.

I completely agree — if Nextt delivers on everything it promises, $100 is a bona fide bargain, but only if you’re selected to participate. If you’re one of the 87 percent of founders who don’t make the cut, it’s hard to argue you received great value for your money.

“Should an idea not be accepted, Nextt provides the applicant with a summary of their idea’s strengths and weaknesses,” Rajani explained to me, clarifying the accelerator’s position on the matter. The accelerator also offers feedback and suggestions on what an applicant could try next.

Some sort of pay-to-enter competition?

Charging someone to apply for an accelerator — not to participate, just to apply — feels really exploitative. I have no problem with companies offering a service and then charging for it. That’s how businesses are supposed to work. Charging for a possibility of being able to work with an accelerator is daft.

“The $100 application fee is clearly disclosed on the apply page and we have not yet received a single complaint from applicants,” Rajani says.

My feeling is that, at this point, you may as well just call it a pay-to-enter competition. If we do, there’s a problem there, too: By not posting terms and conditions or proper contact information, Nextt’s site doesn’t appear to comply with the laws governing these kinds of competitions.

So, what’s the solution?

The question is whether inexperienced would-be startup founders would be willing to pay $100 to enter a competition for someone to tell them whether or not their idea has merit. If the choice is between spending $100 on a 13 percent chance of getting some help, or spending the same amount of money on this 13-book reading list, for example, I know what my choice would be.

From the numbers the founder gave me, it looks like Nextt has received 63 applications to date and accepted eight of them. That works out to roughly $790 in application fees per accepted idea — maybe a more honest way of doing this would be to make applications free and charge the startups $800 if they are selected to participate. It would certainly feel a lot less exploitative.
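
For what it's worth, the back-of-the-envelope math checks out. A quick sketch using only the figures Nextt's founder provided:

```python
# Back-of-the-envelope check using only the figures quoted in this article.
FEE = 100                 # dollars per application

applications = 21 + 42    # two cohorts to date
accepted = 4 + 4

fee_revenue = applications * FEE              # total application fees collected
per_accepted_idea = fee_revenue / accepted    # fees collected per admitted idea
rejected_share = 1 - accepted / applications  # fraction who pay and don't get in

print(fee_revenue)        # 6300
print(per_accepted_idea)  # 787.5, i.e. roughly $790
print(round(rejected_share * 100))  # 87
```

Charging $800 on acceptance instead of $100 per application would raise the same money from the eight admitted teams while costing the 55 rejected applicants nothing.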

Accelerators: Don’t charge for applications. You’re better than that.

Source: TechCrunch

CoFoundersLab gobbles up FounderDating because 1+1=3

CoFoundersLab gobbles up FounderDating because 1+1=3

Got a bright idea for a company, but need a co-founder? You’re about to start walking down a really complicated path, with dangers lurking around every turn. Until very recently, there were two companies to help you along: CoFoundersLab and FounderDating. Until the former acquired the latter, that is, continuing CoFoundersLab’s modus operandi of acquiring its competitors.

Is your future CxO hiding behind one of these buttons?

Both sites were successful in their own right, and today the two communities start the migration process that will move FounderDating’s customers onto the CoFoundersLab platform over the next few months, trebling the size of CoFoundersLab’s current user base.

“There were things CoFoundersLab excelled at and things that FounderDating did better than us,” says Alejandro Cremades, co-founder and executive chairman of Onevest, which operates CoFoundersLab alongside the 1,000 Angels matchmaking site for angel investments. He subscribes to the vision that a startup is more than the sum of its parts and emphasizes that he feels the combined communities will be far better than either on its own. “Combined, we will truly be a great resource for the startup community.”

There will undoubtedly be some shake-ups for both companies as the New York-based acquirer and the San Francisco-based acquiree find ways of working more closely together.

“Engineering and related staff are staying on, as will our city ambassadors,” says Jessica Alter, founder and CEO of FounderDating, outlining the plans for the future. “I will be staying on as an advisor to the company.”

The new CoFoundersLab site is designed to make it easier to find the perfect co-founder.

To celebrate the acquisition, CoFoundersLab is also launching a new site today, starting with bringing the tremendously successful FounderDating discussion boards under the CoFoundersLab umbrella.

The most recent purchase continues the company’s aggressive strategy of buying out competitors to further solidify its place in the market; CoFoundersLab made its first such acquisition in 2012 and, in 2014, snatched up Bizpora, a networking site aimed at entrepreneurs.

FounderDating was originally launched in 2012 by Jessica Alter. It is best known for its namesake concept of “dating” potential co-founders before tying the proverbial cap table knot.

Shiny new learning center

The new site also further shines the spotlight on CoFoundersLab’s Learning Center, which was launched last week. Its aim is to help startup founders make new and innovative mistakes, rather than repeating the same old ones over and over again.

The price tag does give me pause, however — for a platform that is aimed at finding co-founders and early-stage investment, I’m not completely sure how to feel about a thousand-dollar annual price tag to get access to the courses offered by the Learning Center. It seems like an awful lot of money for companies in a stage where a grand buys an awful lot of ramen noodles to stretch out a company’s runway.

Source: TechCrunch

The new Glif smartphone mount is the smartest of its kind

Launched just 18 months after Kickstarter itself, the original Glif was an early beacon of what was to come for the crowdfunding platform, with an innovative tripod mount for smartphones unlike any other. Studio Neat has taken its sweet time to launch the sequel, but the new Glif was worth the wait; it’s a marvelously clever piece of engineering. I spent some time with the prototype to take a closer look.

The prototypes I borrowed from Studio Neat, the company behind the Glif and a series of other products, were 3D printed and will obviously not look like the final product — but given that this is a crowdfunding campaign, I did want to take a closer look at the real thing before writing it up. Even as a 3D-printed piece, the new Glif was fully usable, which is a firm testament to the design and innovation that went into creating this little thing.

The mechanism that locks your phone into the new Glif is pretty damn clever.

The device itself is deceptively simple, but that’s also its beauty. It’s a spring-loaded clamp with foam padding on the inside (both to protect your phone from getting scratched and to ensure a firm grip).

To use it, simply stretch the spring to slip your phone into the clamp. Then use the lever to secure the phone into place.

The lever both locks the phone and adds a tiny bit of extra tension to the clamp, to secure a firm grip on the smartphone; it’s simple, but it works so well I’m surprised we haven’t seen this used before.

With no fewer than three tripod mounting points, you can use the Glif for all manner of things; you can use it to mount additional equipment to your phone, for example, turning it into a mobile video recording studio, or simply to reconfigure the mount in any direction you like.

It is hard to over-praise the design of this product, especially if you’ve tried a few phone clamps in the past. There are some quick clamps out there, but they don’t hold your phone very well. There are some very firm clamps, too, but they involve a lot of adjusting screws to get them to play nicely with your phone and to mount them in place. Studio Neat, in offering the best of both worlds with the newest incarnation of Glif, is onto a winner here.

The Glif mount holds the phone firmly in any orientation and confidently supports even the chubbiest of phone cases on the beefiest of phones. At $25, it’s a no-brainer for anyone who does video work on their smartphone.

Because of its clever design, the same mount can hold devices ranging from 58mm (2.25″) to 99mm (3.9″) in width, although the actual width of the phones it can handle depends on the thickness and shape of the case used on the phone.

At the Kickstarter campaign’s higher $50 backing level, Studio Neat is also offering a turned hardwood handle and a wrist strap; it’s properly old-school, but it feels great in the hand and, while simple, does a great job of making the phone just that little bit easier to handle.

Being able to mount the phone on a mini tripod — or, indeed, any type of tripod — is spectacularly useful, especially to videographers who want to add a bit of extra stability to their work; the Glif is a strong contender for being the right tool for the job.

Studio Neat indicates the new Glif will be ready to ship early next year, at which point it will no doubt be available for sale from its website. If you want to be among the first to try it out, its Kickstarter campaign is running for another 25 days or so.

Source: TechCrunch

Postmates acqui-hires team behind controversial Famous game

Eager to throw its product development into high gear, Postmates has snagged the team from Hey, Inc. to boost its engineering bandwidth. Hey is best known for its ludicrously viral game Stolen, which was shut down and later reborn as Famous.

“The Hey team was introduced to us by Nabeel, who’s an investor in Postmates,” says Sean Plaice, CTO and co-founder at Postmates. “He brought to our attention that the Hey team might be looking for a new home.”

Hey was founded and headed up by Siqi Chen. He tells me that the company had been struggling over the past four years, trying — and never quite succeeding — to find product/market fit. At Postmates, Chen takes on the role of Senior Director of Product.

Neither company is eager to discuss the exact terms of the deal.

Loading up on engineering firepower

“The main thing we add to the mix is engineering firepower,” Chen says, especially highlighting his skills as a growth-stage engineer. “I was the head of product management at Zynga China for a couple of years. Postmates is just starting to scale aggressively and it feels as if the timing is perfect for us to join.”

The process went at breakneck speed, with just about a month passing from the start of the conversation until the team had a new home under Postmates’ roof.

“We started talking with Postmates in late June,” laughs Chen, “so it all happened very quickly.”

“We already had a great team, but we wanted someone who can raise the bar and take our growth and product to the next level,” says Plaice. “Siqi and the team he brings with him make this decision a no-brainer.”

Famous looking for a new home

The Hey team takes on its new roles effective immediately. Postmates, continuing its focus on delivery services rather than branching out into gaming, decided not to take on any of Hey’s IP.

“We will continue to operate Famous, but we’re looking for a good home for it as we speak,” says Chen, eager not to let the company’s creation die an untimely death. “We are talking to a number of interested parties.”

Postmates today operates in 40 U.S. markets, with a fleet of more than 25,000 couriers making deliveries on the platform. The company launched an unlimited-deliveries subscription plan at the end of March.

Source: TechCrunch

Hands on with the Mophie Charge Force wireless charging system

In launching its Charge Force wireless charging accessories last month, Mophie wanted to convince you that wireless charging was the future of everything that was lovely and good. On paper, it sounded like the perfect solution for people who can’t stand plugging in their phones every three seconds — but I decided to take a closer look to see whether it lived up to its hype.

At launch, the Charge Force line-up consists of a Juice Pack battery phone case ($99.95), a wireless charging base ($39.95), a car vent charging base ($59.95) and — if you want the full line-up — a desk mount ($59.95). There’s also a $129.95 Juice Pack and base station pack for people wanting to save a little bit of cash.

Decent case, but too big and clunky

Mophie’s desk stand looks fantastic, and can be turned sideways for Netflix binging on the move.

The updated Juice Pack is a great way to extend your phone’s battery life. From a dead battery, it charged my phone to around 75 percent, saving my day more than once. The Juice Pack case supports both the Qi and PMA wireless charging standards, too, so if you find yourself in a coffee shop or a car with a wireless charging base built in, chances are your case will work with it. Pretty nifty.

Apart from saving my phone from dying, I wasn’t really a fan of the case; the iPhone 6 Plus is an uncomfortably big phone as it is, and the Juice Pack adds just that little bit too much width, height and thickness, not to mention weight. In fact, it adds so much weight that the magnets in Mophie’s own products aren’t able to keep the phone comfortably in place.

The case is also too deep in all directions; the headphone socket is so deeply recessed that none of my headphones work without the included adapter. Try as I might, I couldn’t move the “mute” toggle switch with my pudgy little fingers, either: I needed to use a pen to switch mute on or off. That doesn’t really feel like the future we want to be living in.

My final gripe about the case is that it charges over a Micro USB socket. Yes, Micro USB cables are easy to find and cheaper than Lightning cables, but I already have Lightning cables all over my house, car and office, so it’s not particularly useful. I also use a few other accessories on a regular basis — such as my Oscium oscilloscope — which means I keep having to take the phone out of its case. Not great.

Going wireless at the office and on the go

Wireless life is pretty sweet, you guys. I just wish it was everything I dreamed of.

Gripes aside, I very quickly got used to the wireless lifestyle. The charging pads are smart enough to charge the phone before it starts charging the Juice Pack, so if you make it a habit to leave your phone on a charger pad, you’re always topped up and ready to go.

The desk mount is a nice touch, and being able to grab the phone to take a call or use it as a second screen to monitor, say, Twitter, is nifty. The desk mount also turns to a horizontal orientation, should you want to watch videos on your phone, which is pretty clever. Unfortunately, the desk mount stand wasn’t sturdy enough to keep the heavy phone and case securely in place; it kept sagging down instead of keeping the screen pointed at my face. 

As I normally use a magnetic mount in my car anyway, I had the highest hopes for the car vent mount. Especially if you connect your phone to the car using Bluetooth, being able to quickly throw your phone up on the mount to charge and keep it in place is fantastic, particularly if you do a lot of shorter drives with frequent stops.

Sadly, the magnets in the vent mount weren’t strong enough to keep up with the weight of both the phone and the bulky Juice Pack — I wasn’t able to use it horizontally, like I normally would; the slightest bump in the road, and the phone would move around (and it fell off a couple of times, as well). Not great; flying phones are a safety hazard. I’d be the first to admit that part of the problem might be the rock-hard suspension in my 13-year-old car, but still, it’s not an issue I’ve experienced with my usual magnetic vent mount, so it feels as if the weight of the Juice Pack is to blame.

Used vertically, the vent mount works pretty well, but I was never really able to trust it not to drop my phone, and I eventually just gave up on the vent mount: safety above all, and all that.

The charging pad and battery case kit will set you back $129.95.

Wireless is awesome, but…

Overall, reviewing the Charge Force line proved to me that I really love the idea of wireless charging. I wish Mophie would make a case that didn’t have a battery built in — just the wireless charging and the magnetic mounting points — as I suspect that would fix most of the problems I ran into.

Ultimately, it all depends on how you use your phone, so your mileage may vary.

We do need to talk about the price, however — the full set of a Juice Pack with a base station, a desk charger and a car vent charger will set you back a hefty chunk of cash, and I can’t really say it feels worth it. I’m a complete gadget nut and I love the wireless charging, but you can buy an awful lot of Lightning cables and much more powerful battery packs for the same price.

If you can live with the quirks, Mophie has a great set of products here — but I just don’t really see a lot of people shelling out this amount of money to solve the relatively minor problem of plugging in your phone from time to time.

Source: TechCrunch

Finding a film

How Pixar’s ‘Finding Dory’ went from an idea to pixels on a screen.

Have you ever stopped to think about what it takes to bring an animated fish to life? How a team of people can take a spark of an idea and make it a reality so millions of pairs of eyes can stare in amazement at the big screen?

I spoke with Pixar’s president Ed Catmull and one of the tech leads on “Finding Dory,” John Halstead, to find out more about the creative and technical journey behind the brilliant barrage of brightly projected pixels.

Pixar is all about technology. Wait, art. Okay both.

Pixar’s history is deeply rooted in technology, tracing its lineage back to a division within Lucasfilm called the Graphics Group. It was founded in the late 1970s, and among its first few employees was Ed Catmull, who began to research how the company could use computers in filmmaking in general and, especially, how it might be possible to make a computer-animated film.

Back then, the technology wasn’t ready for making movies, so we started selling our technology instead.

— Pixar President Ed Catmull

It was spun out from Lucasfilm as its own company in the mid-1980s, at which point Steve Jobs invested $10 million in the company. Half of the money went toward the intellectual property rights from George Lucas, and the other $5 million went into Pixar’s bank account to serve as runway.

“Even all those years ago, we all just wanted to make films right from the start,” Catmull told me. He’s still on the company’s payroll, 30 years later, now in the big chair as president. “But back then, the technology wasn’t ready for making movies, so we started selling our technology instead. We were making little short films right away and when the technology was finally good enough, we could start making feature films.”

“Luxo Jr.” was the first animated short Pixar created, and it was released in 1986. The two-minute short film shimmied into history as one of the world’s first computer-animated shorts, and it received a standing ovation on its debut.

The technologies it started selling were the Pixar Image Computer and RenderMan. The former was a tremendously advanced computer made especially for data visualization; it sold only 300 units, but was popular for specialist uses in geophysics, medicine and meteorology.

RenderMan is a software package that calculates how light moves through a computer-generated scene, so the computer can generate images that look “real.” It has had tremendous longevity — the newest version of RenderMan, still developed in-house by Pixar, was used to make “Finding Dory.”
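A production renderer like RenderMan does vastly more than this (ray tracing, motion blur, global illumination and so on), but the core idea of “calculating how light moves” can be sketched in a few lines. The snippet below is an illustrative toy, not RenderMan code: it computes how bright a single surface point appears under one point light, using Lambert’s cosine law and inverse-square falloff.

```python
import math

def shade_point(point, normal, light_pos, light_intensity, albedo):
    """Toy diffuse (Lambertian) shading: how bright a surface point
    appears under a single point light, with inverse-square falloff."""
    # Vector from the surface point toward the light
    lx, ly, lz = (light_pos[i] - point[i] for i in range(3))
    dist = math.sqrt(lx * lx + ly * ly + lz * lz)
    lx, ly, lz = lx / dist, ly / dist, lz / dist

    # Lambert's cosine law: facing the light -> bright, edge-on -> dark
    cos_theta = max(0.0, normal[0] * lx + normal[1] * ly + normal[2] * lz)

    # Inverse-square falloff: light dims with the square of distance
    falloff = light_intensity / (dist * dist)
    return albedo * cos_theta * falloff
```

A real renderer evaluates something like this for every light and every sample of every pixel, which is why a scene with thousands of lights is so expensive to compute.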

From idea to green light

A movie at Pixar starts with an idea, usually in the minds of one of the Pixar directors. The director comes up with a fraction of a story they want to tell, and it all grows from there. The first phase is called development, where the idea is sown in fertile ground, doused in some creative juice and poked and prodded for a while to see what it grows into.

In development, the idea is subjected to a breadth of talent, trying to flesh out the idea, exploring which way it develops. There may be a little bit of storyboarding, sketching and painting that gives a feel for what the movie will look like, but the plan at this point is to turn the idea into something that can be green-lit and developed into a full-fledged movie.

“The main criterion for us feeling good about green-lighting a movie is that we feel it’s going to be a great movie,” explains John Halstead, the supervising technical director of “Finding Dory.”

Our goal at Pixar is to tell the best story that we possibly can. It is sometimes painful, but ultimately, if it leads to a better movie, it’s worth it.

— John Halstead, supervising technical director on ‘Finding Dory’

Once everyone starts feeling the love for a project, the early pre-production process is wrapped up. There will usually be an idea of what the tone of the film is going to be and what core ideas will get explored, and there will be an outline of a story and maybe some art. This is the point where people like Halstead are attached to the project.

“I represent the technical effort on the film,” Halstead explains modestly. In reality, he is the director’s right-hand person on all things technical, providing direction, vision and resources to all the technical teams on the film, which explains why he gets a shiny title credit front and center as soon as the film ends.

Shaping the story

There is a script of sorts from day one, but at this point in the process it’s not set in stone. From what I’m hearing from some members of the Pixar team, it’s barely even set in rapidly melting soft-serve ice cream.

Script changes are a fact of life in the world of movie-making, but Pixar has a particular reputation for making significant edits in the story even quite late in the production process. That often causes friction among the artists and legendary amounts of work that needs to be (re)done in the process. But ultimately, it’s all done for a reason.

“Our goal at Pixar is to tell the best story that we possibly can. Our entire production process has been centered around that goal, and we bend over backwards to accommodate that,” says Halstead. “It is sometimes painful, but ultimately, if it leads to a better movie, it’s worth it.”

Once the script, or a portion of it, is given the go-ahead, the real work begins. It’s handed over to the story department, which takes the specifics of that portion of the story to heart, aligns it with the overall feel of the movie and creates a set of storyboards.

The storyboards are like comic book versions of a shot, drawn by story artists for the purpose of pre-visualizing the film. They are placed in sequence by the editorial team, so that they convey scenes and deliver a rough sense of how the story unfolds. It’s a big job, too; more than 100,000 story panels were created just for “Finding Dory.”

From here, the story team considers what each shot would look like, and how they would transition from one scene to the next. There’s also a series of drawings created with various degrees of detail. These drawings are mostly digital paintings made in Photoshop that help visualize what everything will look like. Each sequence gets a set of mood boards and concept drawings, too, to help illustrate the colors, lighting and feel of the scene.

Hello? Is this thing on?

With all of that in place, it’s time to nail down the audio; everything is animated around the voice track, so it’s important to get that right early on. However, big stars like Ellen DeGeneres, Diane Keaton, Idris Elba and Willem Dafoe are very expensive, so for the early stages of the production, Pixar uses “scratch voice” — someone from the production team who voices the character so the animators have something to animate to.

Sometimes, the scratch voice feels so right to the directors that they decide not to replace it with that of a “real” voice actor. When that happens, they keep (or re-record in higher quality) the original voice track for the final version of the movie; that’s how you sometimes end up with a producer’s assistant receiving actor-style royalties for a Pixar movie they worked on.

Once there is a voice track, the progress is shown to the director. At this point, the visuals are very simple: boxes moving around on the screen, perhaps some still drawings, and the scratch audio. It is barely recognizable as a movie, but it’s enough to start getting some actual, er, direction from the directors. Up to this point, most things in the movie exist only in two dimensions — paintings, drawings and so on — but the next step is to start building everything in 3D.

Lights, camera, action

With all of those things ticked off, it’s time for layout to step in. The layout department sets up the cameras and the basic lights and environment. But, you guessed it, given that Pixar creates digital movies, “setting up cameras” and “creating lights” and “prepping the environment” involves clicking mice and pressing keyboard keys, and not a lot of gaffer-taping long coils of cables to the floor for health and safety reasons.

The lighting team will start setting up master lighting — the big light sources in a scene — to start creating the mood. The lighting process is actually tremendously complex. In earlier movies, Pixar could have thousands of light sources in a single scene, and it is possible to use a single additional light source just to get that perfect sparkle in a character’s eye. With the new rendering technology in use on “Finding Dory,” the number of light sources has gone down significantly, which saved a ton of time.

Creating the characters and the environment

I made it halfway through my interviews with Pixar staff before I had a mind-blowing realization: Nothing you see on screen in “Finding Dory” actually exists.

Every hint of a smile playing across a character’s face, every hair, every chip in the paint in the scenery, every fan coral waving gently in the water, hell, every drop of water you see — the minutest detail, every single last pixel you see on the screen — is the result of somebody clicking a mouse, moving a pen across a tablet, or typing merrily away at a keyboard. Yeah, I know. It’s incredible.

With that in mind, it’s the character department’s time to shine; every aspect of every character has to be created, and if you’ve never played with building worlds in 3D on a computer before, you’d be amazed at how much work happens at this stage.

First comes the design, which is usually done in 2D first — sketches in Photoshop — followed by some basic 3D drawings. The most prominent characters also have clay models made of them to help bring them to life and to ensure they look right from all angles.

With a good feeling for what the characters all look like, the general shape of the characters is modeled in a 3D design package, and then they are animated and brought to life by the animation department. Animators create the personality and “acting” of the characters.

Another vital part of character-making is rigging: the creation of a series of variables on a character that enable the animators to do their job. The rigging process is fascinating. If you were rigging a human arm, for example, you could define the range of motion that is possible for the arm, the wrist or a finger, but you’d also need to think about what happens to the skin and muscles around the arm as it moves.

Try it now: Look at your arm as you bend your elbow; your muscles bulge slightly, the skin and fat around the inside of your arm creases and deforms. And if you turn your wrist, you’ll see that your elbow also changes its look. All of this needs to be reflected in the models on screen, and a tremendous amount of time is spent here to ensure that when an animator makes a character come to life with motion and emotion, it continues to look realistic.
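That interplay between an animator’s control and the anatomy it has to respect is the essence of a rig variable. Here is a toy sketch of the idea; the names, the range and the linear “bulge” are all made up for illustration and bear no resemblance to how Pixar’s actual rigs work:

```python
class ElbowRig:
    """Toy rig control: one animatable variable (bend angle) that is
    clamped to an anatomically plausible range and also drives a
    secondary deformation, like the bicep bulging as the arm bends."""

    MIN_BEND, MAX_BEND = 0.0, 150.0  # degrees the elbow may bend

    def __init__(self):
        self._bend = 0.0

    @property
    def bend(self):
        return self._bend

    @bend.setter
    def bend(self, degrees):
        # The rig, not the animator, enforces the anatomical limits
        self._bend = max(self.MIN_BEND, min(self.MAX_BEND, degrees))

    @property
    def bicep_bulge(self):
        # Secondary deformation driven automatically by the main
        # control: 0.0 (relaxed) to 1.0 (fully flexed)
        return self._bend / self.MAX_BEND


rig = ElbowRig()
rig.bend = 200.0  # the animator over-rotates; the rig clamps to 150.0
```

The point is that the animator only touches a handful of high-level controls, while the rig fans each of them out into all the secondary changes that keep the character looking believable.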

The final step of making a character come to life is shading. This is the process of making sure the surface layers of the characters are painted with the right color, texture and other characteristics. A jellyfish might need to be mostly see-through, for example; some fish are shiny, while others are more matte; and various body parts of a character need different characteristics.

There’s also a layer of simulation that happens after the animation stage. If a character is hairy, for example (Sulley, we’re looking at you…), an animator can’t be expected to animate each individual strand of hair. The simulation team and the riggers need to work together to give the animator enough tools to do what they need to do. The simulation department adds the “secondary motion” of fins and tentacles, and this simulation allows the characters to move naturally to complement the acting.

The fascinating thing about animation is how backward the lighting process is. In live action filmmaking, lighting tends to be one of the first things considered in a scene. In animation, it’s one of the last parts of the job. The lighting department uses virtual light sources in the scene to illuminate the characters and the set. Technical directors set up the lighting to draw the audience’s eye to story points and to create the correct mood.

Giving the characters a home

Of course, the characters don’t live in isolation; they also need a rich environment to live in. If you’ve seen the first 10 or so minutes of “The Good Dinosaur” (or even “Piper,” the short shown before “Finding Dory” in cinemas), you know what I’m talking about; you never quite know whether you’re looking at a computer animation or one of the most fantastic drone fly-overs ever filmed. Allow me to spoil that one for you: It’s all made on a computer, and that’s precisely what is so incredible.

The world changes all the time, and we have to relentlessly keep trying new things.

— Ed Catmull

Just like the characters, the environments have to be created from scratch: Modeled, shaded and, in many cases, animated to look perfect as the backdrop to the action happening in the foreground. Granted, the sets don’t get the same level of attention — neither from the moviegoers nor the filmmakers — as the characters driving the story forward. The backgrounds and environments are still important, however: If anything detracts from the story being told, or seems out of place, it jars the audience out of the magic.

A culture of innovation

“We have to up the bar,” Halstead says, when I note that the water in “The Good Dinosaur” looked better to me than the water in most live-action movies I’d seen. “On ‘The Good Dinosaur,’ we did a lot of things we’ve never done before, but that is true for every film we create, including ‘Finding Dory.’ The things we learned about water for ‘The Good Dinosaur’ came in handy, but with ‘Finding Dory,’ most of the film takes place in water, and we had to raise the bar again.”

“Innovation has always been part of animation,” Catmull adds, going back to the very beginning of the film genre to make his point: “Don’t forget that when Walt Disney started making movies, it was new technology. It’s easy to forget that when looking back, but he was the person who originally understood that in order to further the artistic side, you have to look to technology.

“All along, the technology has changed,” Catmull says about the last thirty years of animation. “The world changes all the time, and we have to relentlessly keep trying new things.”

I would much rather be proven wrong by my own team than by another studio.

— Ed Catmull

Of course, with trying a lot of new things, you’ll find a lot of things that don’t work, too.

“We are trying to foster a culture where it is okay to fail,” Catmull says. “Most of the time, I don’t even hear about the failures, and if I do, it is because someone is making a big deal out of something that went wrong. That isn’t what we are trying to do.”

I try to dig deeper into whether it is the art that’s driving the technology forward or whether it’s vice versa, a question that is met with a hearty laugh from the Pixar veteran.

“Problems only really occur when people don’t understand it’s a yin and yang,” Catmull says, insisting that at Pixar, technology and art drive each other forward. “We have reached a virtuous cycle now where artists expect new tools to be available to them that can help push their art forward. The tech teams, in turn, thrive because they are being pushed to their limits, too.”

Of course, having a large number of artists and technical people experimenting with new and exciting technology and techniques is all fine and dandy, but ultimately, there’s a deadline, too: Pixar is a business like all others. Films need to get shipped to theaters, money has to be made and mortgages have to be paid.

“They all know there is a deadline. But if someone has an idea, and I think they are wrong, I am not going to stop them,” Catmull says. “I will fund the project anyway. If it doesn’t work, I would never say ‘I told you so.’ It’s counter-intuitive, but here’s the thing: If the team believes they are right, they will work very hard to prove me wrong, and I would much rather be proven wrong by my own team than by another studio.”

How do you direct bits and bytes around the screen?

“Animators get acting notes, much like directors would give to actors,” says Halstead, giving a series of examples. A director might ask a human actor to show more emotion, to speak with more fervor, to turn slightly or to show conflicting emotion — and the same is true for director’s notes on an animated film. “It is all about getting the characters to perform in a way that tells the story in the best possible way. That is true both for live action and animated films.”

Once the dialogue is recorded by the talent — the famous performers you’ve likely heard of — it gets a lot more expensive to make changes, so it becomes more and more important to solidify the story and the script as the process moves on.

“Calling Ellen back to record some additional lines is expensive, for sure, but the main challenge is scheduling,” says Halstead. “The bigger the star, the busier they are, and we have to be mindful of that.”

A lot has changed since ‘Nemo’

“Finding Nemo” was a huge technological accomplishment in its own right when it came out, but Pixar’s debut fish movie was released over 13 years ago, and a lot has changed in the world of animated film since then.

“’Nemo’ was the first film I ever worked on,” Halstead says, briefly drifting down memory lane as he relives the days of writing code that would help fish swim along paths, building the anemones and modeling coral, before eventually moving into more of an effects-driven role, including water effects.

“Everything has changed since ‘Nemo.’ Absolutely everything,” Halstead says, unable to pinpoint exactly what the biggest development is. Software and algorithm changes were enabled by much more powerful hardware, which drove further innovation.

“We benefit not only from 13 years of advancements in the underlying technology, but also from 13 years of production experience, and finding better ways of making great movies.”

In the “Nemo” days, water simulation was in its infancy, and while the movie stands the test of time, the difference between the water in “Nemo” and in “Dory” is pretty spectacular.

Everything has changed since ‘Nemo.’ Absolutely everything.

— John Halstead

“On ‘Finding Nemo,’ we actually had to scale back the number of shots where we had characters interacting with the surface of the water,” Halstead recalls, “and we had to be very careful with things like breaking waves and other movement. Today, we have the flexibility to do as much of that as we want.”

The computer-generated water splash in the video below illustrates how far we have come in the 13 years since “Nemo” was released. This short video would have been all but impossible back then.

In addition to the new tools Pixar developed in-house, the teams also adopted some external ones. For lighting and shading, the company used The Foundry’s Katana for the first time. For water effects, it relied heavily on SideFX’s Houdini, a procedural animation tool. “Procedural” here means that everything Houdini outputs is the result of math: simulations and equations all the way down.

Below is a demo reel showing what can be done with Houdini, a software package used extensively in VFX and animated films. It is heavily focused on simulation, making computers do the heavy lifting so effects look spectacular.

[embedded content]
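To make “everything is the result of math” concrete, here is a toy sketch in plain Python (not Houdini; its own tools are out of scope here): a water surface whose height at any point and time comes from equations alone, with nothing stored or hand-animated. The wave amplitudes and frequencies are arbitrary numbers chosen for illustration.

```python
import math

def water_height(x, z, t):
    """Height of a toy 'procedural' water surface at point (x, z) and time t.
    The value is pure math, a sum of sine waves, which is the core idea
    behind procedural tools in miniature."""
    return (0.3 * math.sin(1.2 * x + 2.0 * t)
            + 0.15 * math.sin(0.7 * z - 1.3 * t)
            + 0.05 * math.sin(2.5 * (x + z) + 3.1 * t))

# Sample a small grid at t = 0; a real tool would evaluate something like
# this per vertex, per frame, at film resolution.
grid = [[water_height(x * 0.5, z * 0.5, 0.0) for x in range(4)] for z in range(4)]
```

Change an equation and every point on the surface, in every frame, changes with it: that is the appeal of the procedural approach.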

Another new technology introduced on “Dory” is USD, or Universal Scene Description. It describes and shares geometric information between software packages, keeping all the assets for a movie in a consistent set of file formats.

It also helps ensure that if you change an asset in one place (say you tweak the design of a cup in one scene), the change is carried through to all other scenes as well. Pixar uses USD extensively, but is also hoping it will become a broader standard, and plans to release it as open source later this year.
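That sharing mechanism works through references. Below is a hypothetical fragment in USD’s text format, `.usda` (the asset path and prim names are made up for illustration): a scene brings in a shared cup asset by reference, so editing `cup.usda` updates every scene that uses it, while per-scene tweaks such as placement stay local.

```usda
#usda 1.0

def Xform "Table"
{
    def "Cup" (
        references = @./cup.usda@</Cup>
    )
    {
        double3 xformOp:translate = (0, 1.2, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```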

One of the biggest changes in the past decade, Halstead tells me, is Presto, Pixar’s own home-grown animation software package. It was first used on “Brave” and has been used in all of Pixar’s movies since.

The video below is probably only for the most hard-core animation nerds, but it is a rare glimpse at what Presto is like to use.

[embedded content]

The final big change is that “Dory” is the first Pixar film made with a brand-new version of the company’s rendering engine, RenderMan RIS. You’d have to be a truly profound film nerd to care about different types of rendering, so I’m going to skip that for the sake of this article, but if you really want to vanish into a deep (if beautifully rendered) hole of visual artistry, check it out.

Now let’s stop for just a second and imagine you’re creating an animated movie. That’s daunting enough in itself, but supporting three different in-house-developed software packages on top of it? That’s just downright mind-boggling.

Rendering: It’s like music on vinyl instead of on CD

Of course, at some point the time comes to actually put the final movie “in the can,” ready to be shipped to theaters. Before that happens, you need to make some pretty big decisions about what the film looks like. One of the big challenges the team had was related to Renderman RIS.

“When using path-tracing technology, one of the things you have to deal with is that the rendering algorithm generates a certain amount of noise,” says Halstead, comparing this “noise” to film grain on photographic film. “You can reduce the noise by rendering for longer and eventually the noise gets cleaned up.”

The problem is that this is a game of steeply diminishing returns: It might take just a few seconds to get a picture that’s good enough to tell whether the animation works. After an hour it looks pretty good; after a few hours it starts looking very good. But if you’re aiming for perfection, you’ll be running the poor render farm ragged for ages.
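The cost curve Halstead describes follows from how Monte Carlo sampling behaves: the error shrinks with the square root of the sample count, so halving the noise takes roughly four times the work. A small Python sketch, using a one-dimensional integral as a stand-in for estimating a pixel, shows the effect:

```python
import random

def estimate(n, seed=0):
    """Monte Carlo estimate of the integral of x^2 over [0, 1] (true value
    1/3), a tiny stand-in for estimating a pixel by random sampling."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 for _ in range(n)) / n

def rms_error(n, trials=200):
    """Root-mean-square error of the n-sample estimator over many trials."""
    sq = [(estimate(n, seed=t) - 1 / 3) ** 2 for t in range(trials)]
    return (sum(sq) / trials) ** 0.5

# Quadrupling the samples roughly halves the noise, so the last bit of
# cleanup is vastly more expensive than a 'good enough' preview.
for n in (100, 400, 1600):
    print(n, rms_error(n))
```

Each quadrupling of the sample count buys only a halving of the remaining grain, which is exactly why a near-perfect render costs so much more than a presentable one.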

We could play the vinyl version of the movie, with a little bit of grain, or we can get you the CD.

— John Halstead

So the technical team decided to try an experiment: What if they rendered the same scene for different lengths of time, producing various amounts of grain, and then saw what actually looked best? Ultimately, the feel of the film was going to be a subjective decision, and if the director insisted on near perfection, that could mean pushing back the film’s release date, or sending someone to the nearest RadioShack to buy another load of servers.

“Our director of photography, Ian Megibben, had a great idea,” Halstead recalls with a smile. “The director loves music. He loves playing music. He loves listening to music. We figured that perhaps the best way to pitch this was to say that we could play the vinyl version of the movie, with a little bit of grain, or we can get him the CD.”

The screening room must have been holding its breath; had the decision been to get the film as clean as possible, it would have had tremendous financial and timeline implications. Luckily, writer-director Andrew Stanton chose to embrace a bit of grain, preferring the look of the movie that hadn’t been rendered to computer-enhanced perfection.

The movie that takes 1,800 years to render

Even with a bit of grain, “Finding Dory” wasn’t a lightweight. The average render time on the movie was around 53 hours. Per frame. For every single frame in the movie, rendered at the final 2K resolution needed to show the film in cinemas.

“We had some frames that were pretty fast compared to 53 hours,” says Halstead, “but we also had scenes where there was so much going on that it took 10 times longer — 500 hours or more — to render a single frame.”

In “Finding Dory” there are 24 frames per second, 60 seconds per minute, 102 minutes in the film. And there are two cameras to render, because the film is in 3D. If we’ve done the math right, that means you’re talking about a render time of, oh, 1,800 years, give or take.

Of course, you can parallelize a lot of that; if you are rendering on a high-end, 16-core gaming rig with a ton of RAM, you might be able to get that time down to just a century or so. Eager as we are to see “Dory,” I’m pretty sure few of us would be willing to wait a few hundred years to enjoy the film, so it’s a good thing Pixar has a pretty beefy rendering farm.
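The arithmetic above is easy to check. A quick sketch using the figures from the article (the 16-core number naively assumes perfect parallel scaling, which real render farms never quite achieve):

```python
# Back-of-the-envelope render time for "Finding Dory", using the
# article's figures: 24 fps, 102 minutes, two cameras for stereo 3D,
# and an average of 53 hours of rendering per frame.
FPS = 24
MINUTES = 102
CAMERAS = 2             # stereo 3D: one render per eye
HOURS_PER_FRAME = 53    # average final-quality render time

frames = FPS * 60 * MINUTES * CAMERAS
total_hours = frames * HOURS_PER_FRAME
years = total_hours / (24 * 365)

print(f"{frames:,} frames to render")
print(f"{years:,.0f} years on a single machine")
print(f"{years / 16:,.0f} years on a 16-core rig")
```

That works out to roughly 294,000 frames and about 1,800 machine-years, or around a century with an idealized 16-way split, matching the estimates above.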

More important than rendering out the final film: Imagine you’re working on a scene and have to wait 53 hours to see what your changes actually did. That’s no sensible way to work, which is why Pixar has incremental rendering technology.

Yes, the full, final, all-bells-and-whistles render still takes a long time, but you can get a good feel for what you’ve just changed much, much faster than that. A version with a lot of noise can come back in seconds or minutes.

Below you can see how progressive rendering works: At first, the picture is basically useless, but it rapidly becomes clearer. Whenever the animator moves the light source, it restarts the rendering from scratch.
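That restart behavior can be modeled in a few lines. This is a toy stand-in, not Pixar’s actual renderer: each new sample refines a running average of a single “pixel,” and any scene change discards the accumulation, just as moving the light restarts the preview.

```python
import random

class ProgressiveRenderer:
    """Toy model of progressive rendering: samples accumulate into a
    running average, and a scene change resets the accumulation."""

    def __init__(self, true_value):
        self.true_value = true_value  # stand-in for the fully converged pixel
        self.rng = random.Random(0)
        self.reset()

    def reset(self):
        self.accum = 0.0
        self.samples = 0

    def add_sample(self):
        # One noisy sample scattered around the true value.
        self.accum += self.true_value + self.rng.gauss(0, 0.5)
        self.samples += 1
        return self.accum / self.samples  # current, still-noisy estimate

    def move_light(self, new_value):
        self.true_value = new_value
        self.reset()  # every sample gathered so far is now invalid

r = ProgressiveRenderer(true_value=1.0)
for _ in range(1000):
    estimate = r.add_sample()  # converges toward 1.0 as samples pile up
r.move_light(2.0)              # scene changed: the preview starts over
```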

Hank, the troublesome septopus

One of the core characters of the movie is Hank, a seven-legged octopus (“That means you’re a septopus,” as Dory points out in the film), who caused some tremendous challenges for the team. Think back for just a second to the bit where I explained what a rigger does: setting up joints and bones and constraints for movement and all that fun stuff.

Well, if you’ve ever spent any time watching octopuses (and you should; search for them on YouTube, or take some scuba-diving lessons and find one yourself), you’ll quickly realize that “constraints of movement” aren’t really a thing for an octopus. They’ll gladly squeeze through the tiniest holes; their arms can stretch, expand, twist and turn; and they can practically turn their whole bodies inside out.

Now imagine the day at work where someone taps you on the shoulder and says “Hey, Jim, it’s your lucky day. Could you just rig this octopus for me really quickly? That’d be great.”

“We were just kind of scratching our heads,” Halstead says. “We had no idea how to even approach this character; how do you make it appealing from a design standpoint in terms of making it a character people would feel was approachable and likable, fun to watch on the screen and can deliver a performance?”

The team had to figure out even more basic challenges, like where his mouth would be and how they would make Hank speak. To take but one example: An octopus’s mouth is in a place that wouldn’t make for a great close-up in a kids’ movie. And once the character was fully designed and rigged, a new generation of challenges arrived. There are scenes, for example, where Hank needs to sit in a baby stroller or move across land. You won’t be surprised to learn that Pixar’s animation team couldn’t find reference footage of those particular actions on YouTube.

About a minute into the “Finding Dory” trailer below, you can see Hank in action.

Half a petabyte of data to make a 100-minute movie

The film has been a long time in the making, but the hard slog seems to have paid off. Pixar’s movies take five years to make on average, an eternity compared to the two-year turnaround you’d expect from your average Bond film. “Finding Dory” netted the company the biggest opening weekend in Pixar history. It is so successful, in fact, that even though it opened just weeks ago, “Dory” is already snapping at the heels of “Toy Story 3,” the company’s most commercially successful film ever.

“So, could I save the movie on a memory stick?” I ask Halstead, naively, and get a slightly nervous laugh in return. It would have to be one hell of a memory stick.

“The whole film takes up more than 500TB,” Halstead points out. To be fair, that number includes all the final renders, assets, cached data, computer models, digital paintings and so on — but it’s all data Pixar keeps for future reference. Let’s put it this way: If you backed all of that data up to Blu-ray discs and lined them up on a shelf, the shelf would have to be as long as a football field. And if you were dumb enough to back it up to CDs instead, you’d need quite a ladder: your neatly piled pillar of CD cases would stretch several kilometers into the sky.
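We can put rough numbers on this. A quick sketch, where the disc capacities and case thickness are my assumptions for illustration (dual-layer Blu-rays, standard jewel cases), not figures from Pixar:

```python
# Rough disc math for 500TB of movie data. Disc capacities and case
# thickness are assumptions for illustration, not figures from Pixar.
TOTAL_TB = 500
BLURAY_GB = 50          # dual-layer Blu-ray capacity
CD_MB = 700             # standard CD-R capacity
CASE_MM = 10.4          # standard jewel case thickness

blurays = TOTAL_TB * 1000 / BLURAY_GB
cds = TOTAL_TB * 1_000_000 / CD_MB

shelf_m = blurays * CASE_MM / 1000       # Blu-ray cases side by side
stack_km = cds * CASE_MM / 1_000_000     # CD cases piled straight up

print(f"{blurays:,.0f} Blu-rays fill a {shelf_m:.0f}-meter shelf")
print(f"{cds:,.0f} CDs stack {stack_km:.1f} km high")
```

Under these assumptions, that's about 10,000 Blu-rays filling a shelf around 100 meters long, and over 700,000 CDs stacking several kilometers high.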

But ultimately, a movie isn’t a stack of zeroes and ones, just like a painting isn’t just blobs of pigment on a canvas. Like all art, it’s about what it makes you feel, and that’s where the real magic happens. To me, the most impressive part of “Finding Dory” is that you don’t think about the hundreds of people involved in making it, the render-farms whipped to within an inch of their lives, or the software that brought it all together.

There are a lot of people and a lot of tech that made the movie possible, but it’s all invisible, stepping aside so we can just enjoy the film and keep on swimming.

Source: TechCrunch