Free Newspapers, Free Software

Since I moved to New York, I have frequently noticed people giving away free daily newspapers at the entrance to the subway. I never take one - watching the other passersby, it looks like a few people take the newspapers, but most pass them up. These papers clearly follow a volume strategy - they try to hand copies to a ton of people, and if even a few percent take the bait, they get pretty good distribution. However, there's a question as to how much value that distribution has. You see a lot of discarded copies strewn all over the place, so I'm going to guess that the value provided by such free periodicals is quite low (I can't think of the last time I threw a magazine or newspaper I paid for on the ground after I finished reading it).

Back in the days when I did read such things, I remember seeing mostly low-quality stories pulled from news wires, accompanied by a high density of ads. Basically, these "newspapers" are fake content created as a backdrop for selling ads. Everyone kind of knows this, but there is nominal value created, so it sort of hobbles along. Commuters need something to read on the train, and advertisers need a medium for their ads, so they are willing to pay for exposure. Overall, these content factory newspapers stay in business - at least for now.

But I've actually noticed something else going on in the subways. A significant number of people are fooling around on their iPhones or Android devices, playing Angry Birds or reading content (there isn't any internet access on the New York subway, so web surfing is limited). I have started to use the subway as an opportunity to read books on my Kindle app. So, it seems like there are now many "better" ways to consume content than the free newspapers, and I predict that the free newspapers will lose most of their readers over time (if they aren't already in decline). They have been eclipsed by a better form of advertisement-laden content (remember that Angry Birds shows you ads as well). This article provides some interesting analysis on the effect of smartphones on free dailies.

So, let's move away from the free dailies and look at the bigger picture - the most important point is that a lot of "free" content provides little value, and is quickly discarded. And, if you give your content away for free, people had better be willing to use it for a significant amount of time (or else it will be discarded on the subway station floor before they even get around to reading the ads).

Now let's examine "freemium" software, the recent craze of building your user base by giving away some version of the product for free. The marginal cost of delivering a single copy of the product is zero (or so proponents of freemium software argue), so you should go for massive scale by giving away a limited version of your product (or even the full version in some cases). In the short term, you don't worry much about revenue. Once you have enough users, you can unveil paid upgrades, or maybe just flip your company for a bundle.

The problem with this point of view is that there is a difference between having a lot of user accounts and having a lot of people who are willing to pay for your product. At the last company I worked for, we had a lot of users, but it turned out that a relatively small percentage actually used our service regularly. Interestingly, of those active users, the free-to-paid conversion was quite high. But there were a lot of "copies" of our service "discarded on the subway floor," and the actual profits from paying customers weren't enough to sustain the company, even after a couple of years. Those "free" users were relatively expensive to support in the aggregate, as were all of the engineers building the software. In the end we pivoted to another product (which was significantly more successful).
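
To make the economics concrete, here is a toy back-of-the-envelope model of that funnel. Every number below is invented purely for illustration (none of these are figures from the company I worked for), but it shows how a healthy free-to-paid conversion among active users can still fail to cover the cost of supporting everyone else:

    # Hypothetical freemium funnel -- all numbers are made up for illustration.
    signups = 200_000                    # total registered accounts
    active_rate = 0.05                   # fraction who use the service regularly
    paid_conversion = 0.20               # free-to-paid conversion among active users
    revenue_per_payer = 20.0             # dollars per paying user per month
    support_cost_per_account = 0.10      # dollars per account per month
    fixed_costs = 60_000.0               # engineers, servers, etc., per month

    active = signups * active_rate                             # 10,000 active users
    payers = active * paid_conversion                          # 2,000 paying users
    revenue = payers * revenue_per_payer                       # $40,000 per month
    costs = signups * support_cost_per_account + fixed_costs   # $80,000 per month
    print(f"monthly revenue ${revenue:,.0f} vs costs ${costs:,.0f}")

With numbers like these, the conversion rate looks great on paper, but the pile of inactive "free" accounts and the fixed engineering costs still leave the business underwater.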

I know of many other companies with products that a lot of people have downloaded but few people really want. Some of these are currently littering my smartphone. They were probably hot for about five minutes, and maybe they raised a nice-sized round at an attractive valuation, but they are essentially dead in the water if they can't increase user engagement a bit (they should have taken the acquisition offer when some big company wanted to buy them for several million).

So, there's a big problem. If your product doesn't actually get users to spend a significant amount of time on it, the lifetime value of a user is going to be pretty low. Whether or not the end-user actually pays you money to use your service, they will pay you with their time if they like it enough (which converts reasonably well to money). The main reason Facebook makes so much money, even with a revenue model that could be significantly improved, is that users are extremely engaged and spend so much time on the site. I find myself coming back to Facebook whenever I hit a lull in my day's activities. I never actually click on the ads, but even if a small percentage of people do so, that's a lot of revenue for Facebook.
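
The same kind of rough arithmetic shows why engagement matters so much for an ad-supported site. With entirely hypothetical numbers:

    # Hypothetical ad math -- illustrative only, not Facebook's actual figures.
    daily_active_users = 500_000_000
    ads_seen_per_user = 40           # per day, driven by how long people stay on the site
    click_through_rate = 0.0005      # 0.05% -- "almost nobody clicks"
    revenue_per_click = 0.50         # dollars

    daily_revenue = (daily_active_users * ads_seen_per_user
                     * click_through_rate * revenue_per_click)
    print(f"${daily_revenue:,.0f} per day")   # about $5,000,000 per day

A vanishingly small click rate multiplied by an enormous number of highly engaged users is still a very big number, and engagement is the multiplier you control.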

Moving back to free software, I pretty strongly believe that if software is worth anything, someone is actually willing to pay you for it. And by "pay for it," I mean pulling out a credit card and giving you actual money on a one-time or monthly recurring basis. If they are only willing to use it for free, then it's likely that they won't be using it for long, and soon it will be cast aside for the next shiny bauble that comes along.

So how do you remedy this if you are developing a product? My business partner and I do an exercise that I call "identifying the customer." This often happens when we are trying to figure out what direction to push product development in, and need to identify who the features are going to target. Basically, we determine who is most likely to give us money for the product, and then figure out how we can better serve those people. As I have pointed out on several occasions, someone can't be the "customer" unless they get enough value from your product that it is worth it for them to pay. The "customer" may not even be the person who uses your product the most, but the person who gets the most upside from its use. I believe that Google's "customer" is their advertisers, and not the hundreds of millions of people who do searches every day. So, in summary, don't make your product "free." Figure out who is willing to pay for it, and then get them to do just that.

Apple and Me

My first memories of using Apple computers come from elementary school in the late 80s; we would play Lemonade Stand and The Oregon Trail during our recess period (the original Oregon Trail, none of the point-and-click wimpiness that kids these days have). These were Apple IIes with monochrome monitors, not top-of-the-line, but all that a public school could afford. My parents refused to buy a computer for the house. "You already have Nintendo. You don't need another machine to play games on." A few years later, I learned how to program in BASIC on an Apple II, and before long, I could write my own games. My father finally caved and bought a computer in 1989, although he never allowed an Apple product in his house.

Middle school brought my first experience with Macs - we had an LC II with a CD-ROM in the school library. Instead of going out to play during lunch, my friends and I would hang around (monopolize?) the computer, looking up articles on the New Grolier Multimedia Encyclopedia (and occasionally playing a game or two). Kids probably don't even use encyclopedias anymore (other than Wikipedia), but this was revolutionary. First of all, you could fit an entire encyclopedia on a single CD-ROM, while a paper encyclopedia took up an entire wall. But, more importantly, this was a "Multimedia" encyclopedia. If you looked up a famous piece of music, you got to hear a tinny MIDI rendition of the piece (thanks to the sound capabilities provided by Apple). It was all pretty amazing, or so we thought at the time.

By the time I got to high school, we were using Macintosh LC IIIs to publish the school paper. We used a different 3.5" floppy disk for each page - heaven forbid a disk break or get corrupted (this only happened once, so far as I can remember). Eventually we moved to using a network mount to store all the pages, and even bought an early digital camera. This seemed pretty high-tech (for 1996). I was still using a PC at home, but at least I had software that let me transfer files between Mac and PC.

This was also the first time I heard about a certain character named Steve Jobs. The school newspaper shared the computer lab with the "Applied Personal Computing" teacher, who was a huge Macintosh fan. He would regale us with Apple trivia and show us his MacWorld magazines; when NeXT was acquired in 1996, I remember hearing the saga of Steve Jobs' triumphant return to Apple. I later learned a lot more about Jobs while reading The Second Coming of Steve Jobs, an unauthorized biography that detailed Jobs' life based on hearsay from people who knew him.

College brought the first Mac that I owned - I started out with a PC, but my friend Jasper quickly convinced me to buy a Mac. I trolled MIT's monthly computer flea market, finally settling on a used PowerMac 7500. After installing a "G3" upgrade card, I began to run it as my primary machine. My time in college coincided with the second golden age of Apple, as some of Jobs' dreams began to come to life. The iMac, the iPod, the Titanium PowerBook, and OS X all made a definite impression on me. Unlike the ugly PCs that I had been using for the past ten years, Macs were refined and pretty. They were well designed and a pleasure to use - truly what a computer was meant to be.

Immediately before I began an internship at Microsoft in 2001, Apple announced the long-awaited white iBook, and I finally pulled the trigger on my first new Macintosh (it was $1499, not including the AirPort card). The experience of unboxing it was truly magical - you got to open a meticulously designed package and were greeted by what seemed like the most beautiful machine in the world. It was almost as if Steve Jobs had custom-crafted the experience just for you (and it remains that way to this day). I brought my iBook in each day of my internship at Microsoft, and I felt like such a little rebel. That machine stayed with me for almost five years, and I was sad to finally sell it.

Back in those days, having a Mac kind of set you apart from the unwashed masses of PC users, who didn't know better than to run whatever Microsoft handed to them. Whenever I met someone else who owned a Mac, a sort of club was formed. We were different, and somehow special, just by virtue of owning the same computer. Apple was exciting, rebellious, and never beige (at least after the late 90s). I always loved Apple's "Think Different" advertisements - it was one of my favorite ad campaigns of all time. I spent quite a while tracking down the Think Different poster featuring Richard Feynman; it still hangs proudly on my wall.

I was actually at the Stanford commencement in 2005 where Steve Jobs gave his now-famous speech. While I had already read and heard much about the creation of the Macintosh, it really added a whole new level of meaning to hear some of it in his own words. Jobs also revealed how he had been diagnosed with pancreatic cancer the year before, and how his life had flashed before his eyes. It is interesting to see how a brush with death changes people - for Steve Jobs, it appears to have redoubled his focus on completing his life's work. He knew that he probably only had a few more years, so he compressed at least a lifetime of work into that time.

Building Apple from an underdog into one of the most valuable companies in the world was an impressive achievement, but I think that the iPad will be remembered as his crowning glory (as he would have wanted). With that invention, he fundamentally changed the way that we interact with computers, and began the transition to a post-PC world.

Rest in peace, Steve.

Fighting the Upgrade Compulsion

On the eve of Apple's annual release of the latest and greatest iPhone, I am forced to wonder whether we really need yet another iPhone. Is there really anything wrong with the iPhone 4, despite it being a "geriatric" one year old? I know a significant number of people who will dispose of their one-year-old iPhone and be first in line for the new model, and I have to admit that this worries me.

My mother and I have discussed the topic on many occasions. "Why do they have to change things so often?" she will ask me. "As soon as you buy something, they release a new technology that isn't compatible." Sure, you can write these comments off to the older generation not adapting as quickly to new technology, but products are being refreshed at a truly breakneck pace. Product cycles have gotten shorter and shorter, even as we slowly approach the end of Moore's law. Yes, capabilities have increased, but not as quickly as the avalanche of new products would indicate. So I have to wonder whether there is really any need to come out with a whole new product line every year, or whether this is just a way for hardware manufacturers to milk more money out of us.

A Brief Bit of History
It hasn't always been this way. I believe that people used to keep TVs for 10 years or more (hard as that may be to believe). I remember my family's first printer (circa 1991), an original HP Deskjet 500. Despite the fact that it shared a brand name with the current models, it had little else in common with them. The thing was sturdy as a rock, and took up almost as much space as the computer sitting next to it. It also weighed a ton, and was designed to last for many years. I believe that it was still in service when I went to college in 1998. These days, inkjets offer much "higher quality" and "faster performance," yet they rarely seem to last more than two or three years. Build quality is much lower, and after a while they seem to fail in a way that necessitates replacement. You just sort of throw it in the landfill and buy another. This is somehow justified by lower prices - my sister just bought a wireless multifunction printer/scanner/copier for under $100. At that sort of price point, why not throw out the old one every year or so and just get a new one?

Phones have followed a similar trend, but things are probably worse in that space. It used to be that you could keep a phone for four years or more, but that was before the advent of smartphones. I still fondly remember my LG flip phone, which lasted me from 2003 to 2007. Now, any smartphone that is older than one year is pretty much guaranteed to be obsolete, at least if you want to run the newest applications or the latest version of iOS/Android. Commercials put out by the manufacturers and phone companies now tout "4G" and "dual core," implying that you aren't cool if your phone isn't the latest technology. This seems ridiculous, considering that even a low-end laptop that you buy today will probably run OS updates for at least 3-4 years (with a few modest upgrades). Low-power electronics are advancing more quickly than PCs (due to the proliferation of multi-core ARM processors), but things aren't advancing as quickly as manufacturers and carriers would imply. Rather, you see a lot more staggered upgrades, with each minor spec bump ushering in a new "generation" of products. For example, most of the "super-fast" 4G phones use the same processors as the 3G models, with the only significant upgrade being the data speed. And some of it is pure marketing - 2010's Nexus S Android flagship actually had quite similar hardware specs to 2009's Nexus One (the 2011 Nexus phone will have dual-core and 4G, but I wonder how many new use cases this will truly enable).

So Why is This Happening?
Although I will admit to loving Apple products, Apple has been one of the worst offenders, and has helped to drive rampant consumerism. Every year, they release new models (and brilliant ad campaigns) that almost "force" you to upgrade (I know a significant number of people who upgrade all their Apple products the day a new model is released). And it's no wonder that Apple has become the most valuable company in the world - they have transitioned from selling products with 4-year lifecycles to products with 2-year lifecycles. And as Apple touts ever more "environmentally friendly" manufacturing processes, they are simultaneously encouraging you to throw perfectly good electronics into the landfill!!

After some careful thought, I believe that much of the time a purchase decision is triggered by something other than actual need. Cleverly constructed advertising campaigns program our minds to want whatever just came out, even if we don't really need it (or even want it). For example, I somehow "want" one of the new Kindles, even though my current model is only two years old (and works just fine for reading pretty much any eBook). Sure, the screen on the "Kindle Touch" refreshes a bit faster, and the new model is smaller and lighter, but I'm sure that a purchase won't measurably change my life (even though Amazon wants me to think that it will). And the new product is so cheap - only $79 (assuming that I am willing to periodically look at "special offers"). A recent study claimed that Apple products activate the same portions of people's brains as religious experiences - consumerism has truly become a religion.

The Solution?
So how do we combat this? I'm not going to suggest that we boycott Apple or any other manufacturer - they are just taking advantage of existing market conditions. The hardware manufacturers will continue to upgrade their products at a breakneck pace, so long as there is someone to buy them. And they will continue to obsolete old products as quickly as they can possibly justify doing so. After all, that's how they make money (not by issuing free upgrades for existing models). Maybe we can regulate the advertising campaigns a bit, or force all product commercials to include a disclaimer that discloses the product's environmental impact. However, I'm not sure that this is doable. The ridiculous amount of money that is at stake necessitates that the system be gamed in any way possible.

I think that the most important thing is consumer education. We need to fight the urge to constantly upgrade, and to change our purchasing habits. If we want to do the environment a favor, we should worry less about environmentally safe packaging, and more about reducing consumption. We should keep our phones and computers for longer, and ask ourselves whether we really "need" that new model every year or two. I managed to keep my last Macbook going for about four years with a few RAM and hard drive upgrades, and I bet that my new model will last just as long. And this is my primary machine, which I use 12 to 14 hours a day for a fairly demanding array of tasks. A smartphone should last at least two to three years. If a manufacturer discontinues upgrades after less than a year, consumers should indicate displeasure with the state of affairs (especially if the product is crippled to prevent third-party updates). If a product fails prematurely, we should do the same. Sure, it's easy to blame someone else, but I think that we need to take a bit more responsibility for our consumption habits. And I guess there is no reason my Droid 2 shouldn't last me another year (iPhone 5 be damned).

Managing Your Weight

I will admit that my weight has been a struggle for much of my life. I was a slightly chubby kid, and being kind of shy and awkward, this only made things worse while I was growing up (at least that's how I saw it). Although I lost weight when I hit my growth spurt, by the time I was in college, my weight had started to creep back up (hitting nearly 200 pounds my senior year). During my adult life, my weight has ranged from 165 to about 190 pounds (with the exception of my senior year of college), and I would say that my healthy weight range is 170-175. Since I tend to become consumed by whatever activity I am focused on, it is easy for me to get wrapped up in school or work, and to not make time to eat healthy meals. Over time, this usually results in my gradually gaining weight when I am not actively focused on keeping myself healthy.

In my attempts to slim down, I have tried a pretty good number of the diets out there, including everything from intense exercise to counting nutrients to low carb to "tricking" my brain by drinking flavorless oils. I even tried eating slowly for a while (this actually worked pretty well while I kept it up). Overall, I have lost weight and eventually gained it back at least three or four times in the past ten years. By following the Four Hour Body Diet (from the book of the same name by Tim Ferriss) for the past few months, I managed to get my weight back down to where I like it, but more importantly, I have done a pretty careful analysis of what weight loss tactics do and don't work for me.

So what are the most important things that I have learned?

The Basics
First of all, it's important to eat at least three meals, particularly breakfast and lunch (dinner is occasionally skippable). I have found that when I skip breakfast or lunch, I go into the next meal famished, which typically results in my eating far more calories than I would have otherwise. If I eat a healthy breakfast and lunch, I can often have a small dinner and be satisfied. It is also important to eat your meals at fairly regular times. Make sure you eat breakfast within an hour of waking up, and try to eat lunch no more than four hours after eating breakfast (this is the old "eat before you are hungry" credo). I don't find that eating more than three meals does much to reduce hunger. However, skipping snacks is definitely helpful for me.

Also, exercise alone is not a good way to lose weight. Your body has a finely tuned equilibrium, and it resists changes to that equilibrium. If you up your exercise level significantly without changing your diet, you will become uncontrollably hungry, and will consume enough calories to make up for the additional burn. You may gain muscle and lose a bit off your waistline, but the changes will likely be minor. There has been a bunch of recent scientific research that corroborates this.

Despite the advice of some dieting experts, a calorie is not a calorie. I do know people who have lost weight with strict calorie-restricted diets, but these haven't worked well for me. I just get too hungry when I calorie restrict, and find that this actually causes my body to go into starvation mode. Eventually I crack, and I gain the weight back from the resulting binge. One guy that I knew in San Francisco lost over 100 pounds through a calorie-restricted diet, and seems to have kept it off quite well. So I'm not saying that counting calories can't work, just that it hasn't worked very well for me.

Trick Your Brain to Lose Weight
What I have found is that you need to find a way to trick your brain into letting you lose weight. Your brain is trying to keep your weight constant, so willpower alone probably won't be enough for the long-term (scientific research has actually shown that you "use up" your willpower over time, so willpower probably isn't enough for any sort of long-term effort). There are a number of ways to "trick" your brain. Some people believe that you can alter your "set point" by consuming flavorless calories (read The Shangri La Diet for more information). I tried this diet for a couple of weeks, and found that it didn't work very well. However, you may experience very different results, so it might be worth a try.

I have found the most effective path to losing weight (at least for me) is to eliminate high-density carbs and sugar, and to make sure I get enough healthy protein and vegetables. I have a terrible sweet tooth, but if I can eliminate sugar from my diet, the cravings quickly disappear. If you are willing to count grams of macronutrients (protein, carbs, fat), The Zone Diet works pretty well. If not, you might want to try the Paleo or Four Hour Body Diet (which essentially involve a bit of estimation and a number of fairly simple rules). I personally have found that the sugar content in fruit makes me hungry, so I need to pretty much eliminate it (although one piece a day probably won't kill you).

I am also going to put in a plug for Weight Watchers. It is not my cup of tea, but my mother (who has kept herself thin for her entire adult life with a careful combination of diet and exercise) is a strong proponent. It combines a systematic approach to food selection with a live support group, and for some people this works great. I find that I don't really need the support group, and counting food units isn't really sustainable for me (I need to be able to follow simple rules).

Eat Healthy Foods
I believe that whatever diet you choose, you need to make sure that you are eating healthfully. Eating "healthy foods" isn't enough - you can stay fat if you eat healthy foods in the wrong ratios. Conversely, there are definitely diets that will cause you to lose weight but aren't healthy for your heart or body. The Atkins diet will cause you to lose weight, but some adherents manage to do this while still eating garbage (eggs covered with cheese and surrounded by slabs of bacon come to mind). My mother needed an emergency appendectomy while on the Atkins diet, and from what I have read, this is more common than you would expect. There is some research that indicates a link between the Atkins diet and colon cancer. Dr. Atkins himself suffered multiple heart attacks while eating his own diet, and while they didn't kill him, I'm guessing they were related to his eating style. A lot of this can be avoided by limiting saturated fats, and by eating plenty of green vegetables (which can be done even on an Atkins-style diet).

Make Sure to "Cheat"
Also, it is important to cheat every once in a while. If you always adhere to a strict diet with no exceptions, you will eventually fall off the wagon. I have found that a weekly cheat day actually makes me want to eat healthfully for the rest of the week (I get lethargic and woozy-headed about mid-day on my cheat day). Then I'm ready to go until some time near the end of the week, at which point I'm only a day or two from my next cheat day. Also, instead of a single cheat day where you go crazy, you could eat one "cheat item" every day (for example, you could have a few fries or a small amount of dessert with your dinner). My friend who lost weight by calorie-restricting didn't include alcohol calories in his tally, although I believe that he did put some limits on his alcohol intake.

Finally, I do believe that it is important to combine diet with some form of exercise. You can go to the gym, or if that isn't your cup of tea, play a sport. If you live in a walking city, you can do this by skipping the subway every once in a while. Sure it may take half an hour to walk to work, but if you do this just a few times every week, you are pretty close to your exercise requirements. I recently bought a FitBit, and use it to track my daily steps.

Also, I haven't really talked about stress and happiness, but I have found that they correlate highly with my weight. During the periods that I have been the least happy and the most stressed, my weight has been the highest (I will admit to being something of a stress eater). Likewise, when I am happy and relaxed, I tend to lose weight (either actively or passively). For some reason, every time I go on a 10-day meditation retreat, I lose at least 5 pounds. I'm not sure whether this is caused by the meditation or the healthy, vegetarian diet - probably a combination of the two. In fact, whenever I notice my weight increasing, I start to ask myself whether it is being caused by stress, and typically the answer is yes.

Overall, to keep your weight down, you need to find a sustainable system that works for you. There are a lot of diets and systems out there, and it is important to pick one that fits your lifestyle. If you are a vegetarian, switching to a meat-heavy diet may not be the best solution (but there are plenty of plant-based diets that are supposedly healthy). Likewise, something that requires you to count grams of protein and carbohydrates may be hard if you don't like to cook.

What A Nerd Looks for in a Non-Technical Cofounder

At least once a week, I meet someone who is nontechnical, but who is looking for a "technical cofounder." The problem is that these people never seem to know what makes nerds tick, and what they can offer that will draw the nerd to them (this isn't a one-way relationship, but they are usually looking for someone to build "their" idea, so the attraction primarily comes from their side). I also meet a lot of nerds who are looking for something new, but who don't want to deal with the non-technical aspects of the business, and don't know how to find the right person to work with. So, as someone who kissed a lot of frogs in a quest to find a business partner whose strengths complemented mine, I'm writing this post. Basically, it's a guide to how a nerd thinks, and it tries to help you give him what he is looking for.

The Things That Motivate Nerds
The first thing you need to understand is that nerds think quite differently than non-nerds. In a previous post, I talked about the difference between the way entrepreneurs and engineers think. Understand that the nerd is usually not doing a startup to get rich (although no one minds making money). He just wants the opportunity to build something really cool in an open environment. If he is worth his salt, he can probably get a job making more than you can possibly pay him, so the opportunity to build something cool is by far the most valuable thing you will offer him. It is possible to judge his reaction, and see whether he is excited by the opportunity you are pitching. If not, you may either decide to move on to someone else, or you may think about how you can modify or reframe the opportunity to suit his preferences. Do not try to hard sell him - this rarely works (and even when it does, you will probably have a dissatisfied engineer). 

Be Open To Feedback
So, going along with that theme, a good nontechnical cofounder should be open to feedback on both the product and the business. The hardest people to work with have a totally fixed idea of what the product will be, and aren't willing to change that. They are just looking for others to blindly follow orders. If you are looking for this sort of arrangement, you should probably just hire someone on oDesk to build it for you (and good luck, because you are probably going to fail). Rather, you should think of your engineering cofounder as a valuable and completely different perspective. If he is smart and analytical (which he should be), he may be able to poke holes in your business plan by viewing it through his own lens. Make it clear that you value his input, and try to actually do this rather than giving it lip service.

Understand that Engineering at a Startup Can Be Lonely
Another important trait is that the non-technical cofounder understands how lonely it can be to be an engineer. Even though a significant percentage of engineers I've met are introverts, not all engineers prefer to be by themselves all the time. Being the technical cofounder of a startup usually requires that you spend most of your time holed up coding, and that can get lonely. Remember that this is different from nontechnical work. If you are a solo nontechnical founder, you probably spend most of your days on the phone, on email, and meeting with random people. This is one reason why many engineers prefer to work at big companies - they get to interact with a lot of other people like themselves. In my case, this is the primary reason why I can't do a startup by myself - during the periods I have had to work solo, I have nearly gone crazy.

Befriend Your Cofounder
So I mean this in the best possible way, but it would probably behoove you to attempt to be your technical cofounder's friend (that's going to mean that you actually have to like him). If you guys are going to work together, you are going to be spending a lot of time together. This sounds far-fetched, but you will probably spend as much time with your cofounder as you spend with your girlfriend or wife. He doesn't just want to be your pet nerd who stays locked up in the coding cage while you go about your business. So make him comfortable with you as a person, and you have already won half the battle. It may be as simple as inviting him out for a drink some time early on in your relationship (maybe after the first day you work together). You would be surprised by how many people don't bother to do this, or who wait too long.

Here's a relevant story from my own personal experience. A while back, I was considering working with someone I had met at a social event, and we decided to test things out by going to a technology conference together. Right after we met up, he ditched me for several hours to go to lunch with some guy. I decided not to work with him (based on that as well as other red flags). When I met another potential business partner, we also decided to meet up at a technology conference. He had been invited to go to lunch with some people, and he asked them if I could come along (actually, he more insisted than asked). That made a really good impression, and we are still working together. He has made a lot of attempts to include me, and I really appreciate that (this is a shout out to you, Sam).

Measure and Report Your Progress
I think that it's also important to show your technical cofounder that you are able to make regular progress at your tasks. It is relatively easy to evaluate an engineer's progress. Every day (or week), he should be building new features (or improving existing ones). If the product is not steadily improving, then he isn't doing his job. Likewise, you need to show that you are moving ahead at whatever your core competency is. I understand that sales has a lot of exogenous factors that can make progress uneven, but the best sales people can close deals at a fairly regular frequency, and can tell you the status of each deal at any point in time. I have spent a lot of time working with non-technical people, and a decent percentage of the time, it wasn't really clear what they were doing. Usually, I came to the conclusion that they weren't actually doing anything, but were twiddling their thumbs waiting for me to build the product (you would be surprised to see how often this actually happens). So my advice to you is that you hold yourself to the same standard, and that you figure out some way to measure and report your progress.

Win His Trust
I think that the most important thing that a nerd looks for is someone that he can trust. To be blunt, a lot of non-technical "cofounders" are full of shit and unable to execute (just as many "engineers" are crappy programmers). If you want to distinguish yourself from the pack, you are going to have to earn his trust. You can do that by following some of the above guidelines, and by showing him that you are actually different. Another thing that I should probably talk about is compensation (even though I said that it is unimportant). A lot of nontechnical people undervalue nerds, and feel like they just want to hire a nerd to build out their grand vision. They kind of miss that the technical execution is a very important part of the startup, and that only the person who is building the technical product can understand some of the product implications (more on this later).

As such, the equity compensation offered to a "technical cofounder" can often be as low as 5-10%. It may seem like you are saving equity for yourself, but you are just screwing everyone. You should be looking to give your cofounder a meaningful stake that will make him feel fully invested in the company's success. I'm not saying that you need to give a cofounder who comes on second an equal stake in the company, but remember that most of the hard work is likely still ahead of you, and that what goes around comes around.

Advice For the Other Side
If you are a nerd and are evaluating potential business partners, I would heed some advice given by a friend of mine. Make sure that you test them as much as possible. See how they measure up on each of these characteristics. And if anything feels even slightly off, run as far as you can. If you are worth your salt, you are a hot commodity, and someone else will come along with an offer that merits your participation (even if it takes you a little bit longer than you had hoped).

Kindle Fire: Is and Isn't

The new Kindle device has gotten some rather interesting coverage in the 36 hours since its release. However, much of that press has focused on what the device isn’t. For example, “it isn’t an iPad killer.” Also, “it isn’t running the newest version of Android.” These comments sort of miss the point. Let’s understand exactly what the Kindle Fire is.

A low-cost media consumption device.

It isn’t a “tablet,” let alone an “Android tablet.” In fact, the word “tablet” isn’t mentioned once on the Kindle Fire’s product page. It is a device that allows you to consume content, just like last year’s Kindle. It even costs about the same as last year’s Kindle ($199 vs $189). The only difference is that this one allows you to consume a heck of a lot more types of content.

It is clear that Amazon has been trying to include rich content in the Kindle for quite some time. The problem is that the e-Ink display is poorly suited to active content, so they had to switch to an LCD to enable this for real. Furthermore, the mobile market is so saturated with platforms that it is difficult to get app developers to program on yet another platform. As such, the Amazon App Store was a brilliant move on Amazon’s part. By building an app store on the sly and releasing it before their own device came out, they essentially short-circuited this part of the curve. People can buy a Kindle Fire and play Angry Birds from day one (and everyone knows that’s the only game anyone ever plays).

But it isn’t an iPad Killer
So let’s look at some of the “isn’t”s. The Kindle Fire isn’t an iPad killer. Nor does it need to be. The iPad has sold phenomenally, beating everyone’s expectations and making Apple the most valuable company in the world. However, I don’t actually know that many people who own iPads. $499 is a lot of money to spend on a device that’s not a computer, especially if you already own a laptop. I am the only person in my immediate family who has an iPad. This is not to say that the iPad sales numbers are invented, just that there are A LOT of people who still don’t own iPads (and don’t plan to buy them).

Amazon found something curious with the Kindle. The more they dropped the price, the more they sold. Almost everyone would want a Kindle if he could afford one. Amazon doesn’t release sales numbers for the Kindle, but they do admit that the Kindle is their bestselling product, and that they are selling more digital books than physical books. I will tell you something else that’s interesting - every member of my immediate family owns a Kindle, and we all use them frequently. The Kindle isn’t as sexy as the iPad, but it is more utilitarian.

So a $199 7-inch tablet won’t compete with the iPad - it will reach entirely new markets that the iPad could never enter. The HP TouchPad debacle was just that - a debacle - but it did prove that people will buy anything if the price is low enough (if you don’t believe me, look at some of the crap on Woot, and make sure to read the discussion forum). The $199 Kindle Fire is a much better device than the $99 TouchPad, at least from a software perspective. Amazon’s products pretty much just work (more on that later). Sure, some people will still choose the iPad for its 9.7-inch screen and better application catalog, but the Kindle Fire will work just fine at its intended use case: consuming books, movies, and the occasional round of Angry Birds.

It isn’t Android
So the Kindle Fire isn’t an Android device. And maybe that’s a good thing. I worked for Google for three years, and fully support the concept of Android. However, I happen to own an Android device, which spontaneously reboots at least once a day (it’s a Droid 2). I also know that the Droid 2 as initially released was only somewhat usable, and that the only reason we prayed daily for an update to Gingerbread was that we hoped it would finally fix the phone (it did - mostly). Every year when a new version of Android comes out, people don’t think “I’m excited about all of the new features.” They think, “maybe this release will make my phone just work. If not, I can always upgrade to the iPhone once my contract runs out.”

So the Kindle Fire is based on Android, but it isn’t Android. That’s actually a good thing - 99% of the people out there won’t care that it isn’t Android so long as they can watch their movies and read their books (and beat up a few deserving pigs from time to time). I’m sure that someone will quickly get stock Android running, but I would bet that it doesn’t work as well as the default OS.

So, in conclusion, I think that Amazon will sell a ton of Kindle Fires, and I kind of wonder how much money they eat on every device they sell. The only thing I personally regret is that Amazon couldn’t somehow release a version with built-in 3G Internet access. And that they released so many new models, which means that I have to choose which one to buy this year…

Knowing When To Quit (and When to Stick It Out)

I think that there are two major ways that many people screw up execution. The first is not being able to move on from a project that clearly isn't working, while the second is quitting too soon. These mistakes seem fairly obvious from the outside, but people constantly make them, so it's probably worth a brief discussion. It actually seems like a significant percentage of unsuccessful entrepreneurs are currently in the process of committing one mistake or the other. I can even admit to having made both of them in my entrepreneurial career - hopefully you can be self-aware enough to avoid these pitfalls.

Not Being Able To Move On
We've seen it all too many times. An entrepreneur is dedicated to an idea that clearly isn't working, but he refuses to change directions and work on something else. He thinks that if he can just hold on for a little longer, he can make it successful. Maybe the customer feedback is clearly telling him that it isn't working, or possibly he hasn't been willing to get customer feedback at all. If you have been working on something for years without seeing any progress, you are probably falling into this trap. I once worked with a guy who had been working on a product for three years. In three years, it had failed to take off, and he refused to work on something even a little bit different. He just kept working on the same bad idea, and it kept not working.

I have some bad news for those people who are "just hanging on for a little bit longer." While I admire your tenacity, it may eventually be your downfall. Within three to six months after launch, most of the successful products that I have seen have achieved some sort of traction. Sure, there have been successful teams that worked on a number of ideas before finding one that worked, but once they locked onto the right thing, they quickly achieved some sort of validation.

Why do we see this unwillingness to give up? In psychology, there is a principle called confirmation bias. People tend to seek out evidence that confirms their existing beliefs, and rarely attempt anything that might actively negate them. Therefore, it is nearly impossible for them to try anything that would invalidate their idea - it would just hurt the ego too much. If you suspect that you may be falling into this trap, I encourage you to do an experiment that I colloquially call "murdering your baby."

Murdering Your Baby
If you want to murder your baby, you have to be willing to do something extreme that will either prove or disprove your product's core hypothesis (I wrote about this in the past). This often involves doing customer development with unfriendly customers (not friends or family), and paying close attention to whether those customers actually want your product. It's actually not hard to tell whether people want what you are building. They will rarely say "I hate this," but if you pay close attention to how they interact with your product or service, you can determine whether or not it is useful. If people don't care about your product, then it is clearly time to make a change.

So there is an important addendum to this - do it RIGHT NOW. I don't care if your product is ready. I hear so many things along the lines of "people will love this product if I only add one more feature." If you don't add that feature, people will still love the product if all of the other features are dead-on. Part of building an MVP (which you are doing, right?) is figuring out the minimal feature set. I'm not saying that you need to do a full launch for your product before it is ready - just get it out there and show it to real customers. If people truly don't want your product, you are doing yourself a favor by learning that early. You can spend your time building something that they really do want.

Quitting Too Soon
So the counterpoint to this is when people quit too quickly. If you are pivoting every five minutes, you probably haven't given things enough time to shake out. Seriously, once you build a product, you need to put it through validation before deciding whether or not to move on. I have even seen people say "this isn't going to work" and move on, even though customer validation was pretty good and there were no signs that they should be quitting. This trap usually happens when the product prototyping cycle is nearing completion, and it is about time to have real people start using it.

I think that this is an ego thing as well. A lot of founders repeatedly kill products to avoid ever having to face invalidation. If you didn't launch it, then it never actually failed. But there is no room for ego in startups - statistically speaking, you will fail every time. You are doing a startup because you think that you can beat those odds, but most likely you still won't succeed this time. The way to succeed is to let go of the fear of failure, and the only thing that is worse than failure is not showing up in the first place.

Lately I have been hearing the line "this isn't a VC business" quite a lot when associated with killing a product prematurely. Killing your idea because it "isn't a VC business" is rather stupid. Here's the reason - VCs have ABSOLUTELY NO IDEA what a pre-revenue company will eventually amount to. Sure, a lot of pre-revenue companies have stupid ideas, but a lot of successful companies started with stupid ideas, and figured things out later. VCs are just making reasonably blind guesses, especially with pre-revenue companies.

If you can show a bit of success, then the VCs will be happy to invest their money. I know numerous people who worked on ideas that weren't "VC businesses", grew them into profitable companies, and suddenly managed to get VC investments at rather attractive valuations. This is not to say that plenty of successful companies didn't get started with the help of VCs - just that securing VC investment early on isn't the ultimate predictor of success.

How To Know When It's Time To Quit
Basically, when you have good hard evidence that your product isn't going to work out, then it is perfectly reasonable to move on to something else. However, you need to give an idea enough breathing room to succeed before throwing it out. One rule of thumb that comes to mind is to build an MVP, put it out there, and get it to the point where people are willing to use it. Once you have the product in the hands of real users, you have three to six months to make it succeed. If you haven't gotten even a little bit of traction in that time, you may want to try something else. However, if you have spent much of that time talking to your customers, you will most likely have figured out what they actually want you to build, and the next product can be your big success (and failing that, the one after that).

The Content-Anywhere Revolution

About a month ago, I was hanging out at a bar with some other entrepreneurs. One of them had a 3G iPad, and was using it to check his email (using the iPad Gmail app). I thought about it for a second, and realized that this was the beginning of a new wave.

Sure I can pull out my phone and check my email (or look at a web page), but it isn’t the full-featured, rich experience that I get from my laptop or iPad. Everything is constrained and dumbed down by the small screen size on a phone - the experience can never be equivalent to a PC. Despite the incredible pixel densities we are beginning to see, the eye can only perceive so much detail per square inch. I used to think that 1600x1200 on a 20” screen created unreadably small text - even after the dawn of HD phones (which will happen this year), the amount of information we can display on a 4” display will still be effectively the same.

The original iPad was only a new product in one sense - most of the hardware was at best a mild upgrade from the previous iPhone. But the real genius was the 10” IPS display - for the first time you had a 10” system that you could carry with you anywhere. That was Steve Jobs’ true vision - a touchscreen computer that was always with you. 

Right now, if I want Internet on my Macbook or wifi iPad, I have to pull out my phone, enable tethering, and connect via wifi. It doesn’t take more than a minute or two, but the experience is clunky and slightly awkward. It isn’t instantaneous, which creates a barrier to entry. Even though I have my laptop with me at pretty much all times, I rarely whip it out on the spur of the moment.

So the 3G iPad actually contained a second revolution - always-on Internet connectivity. For the first time, you have a truly usable device that can access all data from anywhere. As I alluded to before, always-on Internet is going to be huge.

A few years back, we started to see netbooks with integrated 3G. These were innovative, but most probably ahead of their time (just like the netbooks themselves). The experience of using a netbook sucked enough that integrated 3G couldn’t make up for it. The real revolution comes from an elegant and refined product married to always-on Internet connectivity. The Internet connectivity shouldn’t just be a checkbox - it needs to be an integral part of the product experience.

I actually don’t think that the iPad was the first truly usable device with always-on data - that distinction belongs to Amazon. The original 3G Kindle was brilliant - for the first time you could download a book from virtually anywhere. Maybe you are sitting on the beach, and you want a new book. So you log on to the on-device store, download a book, and within two minutes you can begin reading. It’s seamless - you don’t have to think about “tethering.” I actually think Amazon did people a disservice by releasing a wifi-only Kindle (from a UX perspective), even though I understand the strategic reasons for doing so.

So what’s the next move? I think that we will soon see always-on data in more devices. As data speeds get faster, and all of the US carriers standardize on LTE, we will see high-performance laptops with nice screens and integrated 4G data. I’m looking forward to buying an 11” Macbook Air with always-on data. The only question is how battery capacities will manage to keep up - I’m guessing this is one reason why we will soon see most laptops move to multi-core ARM processors.

And then there is always the Amazon tablet that is coming out next week. I don’t predict that it will have 3G (due to cost constraints), but it may be the first high-quality, affordable tablet. Assuming it is successful, we may truly be at the beginning of the post-PC revolution. I would postulate, however, that it isn’t so much about the end of PCs as about the beginning of being able to access all the world’s information from virtually anywhere.

Why Nintendo Shouldn't Release on iOS (Yet)

Recently, some investors have been suggesting that Nintendo start to release games on iOS. They look at the huge market share of iOS, and imagine how many games Nintendo would sell if they started releasing titles there. I think that it would actually be a poor move for Nintendo to release on iOS, from both a financial and a strategic standpoint.

From a financial perspective, they would make less per title. On their consoles, they own the platform, and get a percentage of every game sold. With first-party titles, they make 100% of whatever they sell. On iOS, they only get a percentage of the games that they sell. Sure they might sell more copies, but economically, it would probably turn out worse.
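
A rough comparison with made-up (but plausible) prices illustrates the gap. Suppose Nintendo nets about $30 on a first-party console title, while an iOS release would sell at $4.99 with Apple’s standard 30% store cut:

    # Illustrative numbers only -- rough guesses, not Nintendo's actual economics.
    console_net_per_copy = 30.00            # assumed net per first-party console sale
    ios_price = 4.99
    ios_net_per_copy = ios_price * 0.70     # after Apple's 30% cut -> about $3.49

    breakeven_multiple = console_net_per_copy / ios_net_per_copy
    print(f"~{breakeven_multiple:.1f}x the unit sales needed on iOS")   # ~8.6x

Under those assumptions, Nintendo would need to sell roughly eight or nine copies on iOS to match the revenue from a single console copy.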

One thing that will hurt Nintendo is that the distribution model for iOS and Nintendo is very different. If you go to an electronics store, there is a Nintendo section. They have lots of games for the Wii and the Game Boy. They have to show something, and it will be sold by Nintendo (even if it isn’t a first-party title). With iOS, Apple controls what shows up. Unless Nintendo is constantly releasing titles, they won’t stay on top. Sure, they could release the next Angry Birds or Cut the Rope, but I can imagine that Nintendo will release larger games with a longer development lifecycle. They will need to adapt to a $1.99 price point if they really want to make it on the App Store (ok, I’ll give them $4.99).

Another key strategic issue is platform lock-in. By releasing on iOS, they would actually encourage people to stop using Nintendo products. Right now, Nintendo fans keep buying Nintendo consoles for first-party titles. Once they have the console, they are locked in, and keep buying more Nintendo games. If customers can get Nintendo games on their iPad, they won’t have any platform lock-in. Their next game could be a non-Nintendo game, and they will have no incentive to buy Nintendo consoles. Customers will buy whatever Apple tells them to buy. I think that the strategic lock-in issues will keep Nintendo from releasing on iOS until they absolutely have to (i.e., until their financials are so poor that they need to do it to survive).

Even if Nintendo releases different games on iOS than they do on their consoles, moving to iOS will still probably kill them. The reason is that the dirty unwashed masses of casual gamers aren’t loyal, and they don’t keep up on new game releases. When I was a kid, I needed EVERY game in my favorite gaming series. As soon as the next Mario title came out, I bought it the day it was available. Casual gamers don’t care about every new game that is released - they just play a game every so often. My sister loves Cut the Rope - last time I saw her she was raving about it. I emailed her the other day to ask whether she had downloaded the new Cut the Rope game, and she didn’t even know it existed. Maybe she will eventually buy it, but she has probably had her fill from the first game (I don’t think that she even finished it).

It is a sad day when a gaming company goes from making their own consoles to shipping games on other people’s platforms. It happened to Sega, and I wouldn’t be surprised if it happens to Nintendo at some point in the future (especially as the niche they have targeted, casual gaming, moves more towards tablets and mobile phones). But I think that they should go down that path kicking and screaming.

NOT Everyone Should Attend College, BUT Let's not Give up on it Entirely

Recently, a bunch of people (some of them high-profile and well-respected) have been going around trumpeting that "college is a waste of time." I agree with parts of their argument, but I think that they are missing much of the point. I think that saying "college is a waste of time" makes about as much sense as saying that "everyone should go to college." I honestly think that college fills a vital role for some people, but on the flip side, a lot of education is clearly wasted (either unneeded or useless). In order to resolve things, we need to become more educated consumers of higher education, and we need to figure out how to fix the higher education system.

Useless and Unneeded Education
So what is unneeded and useless education? (And I use these as scientific terms.) Unneeded education is education that is not required to attain your life goals, but that you may think you need. Some people may need it, but others may be able to get to the same point on their own. By contrast, useless education is actually completely worthless, and is just being sold to you by universities who want to make a buck.

If you looked at the early engineers at Google, you would see an interesting dichotomy. Most were highly educated, coming from the best schools in the world. Many of the most successful managers had master's degrees and PhDs. However, there were a few people (and I mean a FEW) who didn't have college degrees. These were actually some of the most impressive people in the place. I remember when I met Aaron Boodman at a party circa 2005 (he has been instrumental in the creation of Google Chrome and several other important initiatives). At some point during our first conversation I asked him what year he graduated from college. For me, college graduation year was just a reference point to determine how old you were (pretty much everyone I knew up to that point had gone to college). His answer was something like "I never bothered to go." He was clearly successful - at that point he was around 27 and had worked for Microsoft and Google. More importantly, he had built a number of successful open-source projects, and was well-known for that work. He had clearly spent the time that most people spend in college learning how to program instead (and learning about life on his own dime). However, I believe that Aaron Boodman is an outlier. Most people could not have accomplished what he did without a college education. For him, college was unneeded education.

When I lived in Boston (for grad school), I met a number of people who had gone to college for four years and then had attended two or three years of grad school. They were over $100,000 in debt, and were unlikely to ever get a job that would allow them to pay this debt off. It is even more unfortunate that these people went to school for things like social work and non-profit management. They clearly wanted to do something useful with their lives, but they fell into a trap laid by an educational system that only pretended to be aligned with their interests. This was clearly useless education - they were sold a load of junk. When higher education was cheaper, this wasn't as obvious, but rising costs have made this much clearer.

Confidence and Motivation
So, why would you want to skip college? The argument can't just be about not wasting time. I constantly see people beating themselves up about how they "wasted this or that year doing this or that thing." But if they hadn't spent that year the way they did, they wouldn't be in a position to know what they were missing. That time clearly gave them something valuable, either in the form of knowledge or confidence. When I was 18 years old, I was super-immature. I definitely wasn't ready to be out in the world on my own. College gave me an important buffer of time to grow and mature, and I think it fills that role for many people. We just need to figure out what purpose it serves, and make sure that it does that well.

There are two things you need to succeed in life (yup, only two) - confidence and motivation. If you have those two, you can attain all of the other things that matter. Not everyone will "succeed" to the same degree, or in the same ways, but heck, we are all different.

First let's talk about motivation. Motivation is what you need to get things done. In the context of learning, it allows you to sit down and study without being told to. You can get knowledge by reading a book or by attending a class. You could put some people in a room with a stack of books, come back a few years later, and they would have the equivalent of a college degree. Other people just don't have that capability. They would probably have spent those years watching TV. They need the structure of a university to obtain that knowledge. It might be possible to come up with a way to motivate those people, but I think that some of them do just fine in a structured university setting (someone needs to work for the companies founded by the self-motivated dropouts).

Now confidence - confidence is knowing that you CAN get things done. Some people have it naturally, but others need a bit more coaching. When I came out of MIT at the age of 22, I realized that I could do whatever I wanted so long as I was sure enough of myself. In truth, I didn't need the fancy degree to succeed at my first few jobs (although it didn't hurt when I was interviewing for them). Most of what I used at work was actually learned on the spot - except, possibly, the realization that I COULD learn just about anything. But I don't know whether I personally could have come to that realization without a college degree from a top institution. I think I got a lot more from college than just that one realization, but that realization was a big part of it. By comparison, my grad school experience at MIT was much less of an AHA moment (since I had already had the aha years earlier). I just needed the confidence to quit my high-paying job and go out and start a company, and going to grad school gave me an excuse of sorts.

Let's Not Throw the Baby out with the Bath Water
I agree that we should encourage students to delay entering college, but I think that we need to focus on fixing our educational system rather than abandoning it. I agree with the unCollege people, but I don't think that they have the whole answer (just a part of it). For some reason, most of the people I knew who delayed college by a year seemed better adjusted than the people who went straight from high school. A lot of my sister's classmates at Brown came in a year or two late and seemed well-adjusted, while a lot of my classmates at MIT went to college a year early and seemed particularly immature and poorly adjusted.

I think that we should encourage kids to delay college by one year. During that year, they should be able to pursue something they are passionate about, or if they aren't self-motivated enough to come up with something, they should be provided with service opportunities that give them a more structured experience. After one year, a lot of them will run (not walk) to college. Some will realize that they can make do on their own, but I honestly think that will be fewer people than you would expect. Hopefully the kids who do go to college at 19 or 20 will be far better customers of higher education than the naive ones who now come in at 17 or 18.

Let's Actually FIX Higher Education
So let's talk about how to fix higher education. There are obviously lots of theories about how things are broken, but I haven't seen that many ideas for how to fix things. I think that the key to ending the higher education bubble is rationalizing the value equation.

First of all, every university should be required to tell each admitted student up front how many years it will take to pay for his or her degree (think of it like the nutritional info they put on the box). The university should be required to attach a page to each admissions letter that says "this degree will cost you X dollars fully loaded. Based on data from recent graduates of our program, you can expect to make Y when you graduate. It will take you Z years to pay off this degree at a cost of Q dollars per month." Every university has this info for each of its programs - it helps them figure out who to hit up for donations after graduation. Maybe they can put it all together and tell you how much "nutritional content" each degree has. If a degree doesn't have enough nutritional content, they can either reduce the cost, or they can alter the educational content to improve graduates' skillsets (optimally there would be a combination of the two).
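To make the "Z years" figure concrete, here is a minimal sketch of the math behind it, assuming a standard amortized student loan. The function name, the 6% interest rate, and the $800 monthly payment are all my own hypothetical numbers, not anything a university actually publishes:

import math

def payoff_years(total_cost, monthly_payment, annual_rate=0.06):
    # Standard loan amortization: how many years it takes to retire a
    # balance of total_cost at monthly_payment per month, with interest
    # compounding monthly at annual_rate / 12.
    r = annual_rate / 12
    if monthly_payment <= total_cost * r:
        return float("inf")  # the payment doesn't even cover the interest
    months = -math.log(1 - r * total_cost / monthly_payment) / math.log(1 + r)
    return months / 12

# Hypothetical example: the $100,000 of debt mentioned above, at $800/month
print(round(payoff_years(100_000, 800), 1))  # about 16.4 years

Even a rough figure like this makes the comparison concrete: the difference between a degree that pays itself off in five years and one that takes sixteen is exactly the kind of thing a glossy brochure never shows a 17-year-old.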

I think that accreditation should be based on nutritional content. Programs that don't meet a certain benchmark shouldn't be allowed to subsist alongside better programs. College freshmen don't realize that the choice between an English major and an Engineering major can drastically affect the quality of their lives in some cases (although if you look at the prospects of English majors coming out of certain universities, they do just fine). You would think this might encourage a shift from liberal arts educations to science- and engineering-based ones, but I'm sure that all programs can manage to tune either the content or the pricing model.

For example, there is extremely limited demand for German professors at our universities. There are only a few openings for grad students, and most of those students don't get jobs when they graduate. And even if you land one of the few jobs as a German professor, the salary isn't that good. But no one tells you that when you declare a German major. So maybe universities will decide to significantly reduce tuition for undergraduate German majors, but only accept 3 students per year. If they want to have 50 German majors who pay full price, those students will either need to take a curriculum that also prepares them for other jobs, or the university will have to put a big red flag on the program that says "THIS ISN'T WORTH IT."

Overall, I think that we can get there. But I think that there are more responsible things we can do than pay people $100,000 to drop out of college. It's sensationalist, and it gets headlines, but it doesn't really address the larger problems that we actually want to solve (although maybe it will start to point us in the right direction).