I'm an entrepreneur, most recently with Lore. Thinking about what's next.

  1. The Decline of the Mobile Web →

    Chris Dixon articulates the danger of an app-dominated internet, where big companies like Apple and Google regulate innovation: 

    People are spending more time on mobile vs desktop.

    This is a worrisome trend for the web. Mobile is the future. What wins mobile, wins the Internet. Right now, apps are winning and the web is losing.

    Moreover, there are signs that it will only get worse. Ask any web company and they will tell you that they value app users more than web users. This is why you see so many popups and banners on mobile websites that try to get you to download apps. It is also why so many mobile websites are broken. Resources are going to app development over web development. As the mobile web UX further deteriorates, the momentum toward apps will only increase.

    The likely end state is the web becomes a niche product used for things like 1) trying a service before you download the app, 2) consuming long tail content (e.g. link to a niche blog from Twitter or Facebook feed).

    This will hurt long-term innovation for a number of reasons:

    1) Apps have a rich-get-richer dynamic that favors the status quo over new innovations. Popular apps get home screen placement, get used more, get ranked higher in app stores, make more money, can pay more for distribution, etc. The end state will probably be like cable TV – a few dominant channels/apps that sit on users’ home screens and everything else relegated to lower tiers or irrelevance.

    2) Apps are heavily controlled by the dominant app store owners, Apple and Google. Google and Apple control what apps are allowed to exist, how apps are built, what apps get promoted, and charge a 30% tax on revenues.

    Most worrisome: they reject entire classes of apps without stated reasons or allowing for recourse (e.g. Apple has rejected all apps related to Bitcoin). The open architecture of the web led to an incredible era of experimentation. Many startups are controversial when they are first founded. What if AOL or some other central gatekeeper had controlled the web, and developers had to ask permission to create Google, Youtube, eBay, Paypal, Wikipedia, Twitter, Facebook, etc.? Sadly, this is where we’re headed on mobile.

    Chris is right that if the current trend continues, the internet will be a less interesting place. Not only does an App Store-dominated world preclude business innovation, it limits individual expression, the lifeblood of the internet.

    The problem with the current model extends beyond big companies playing gatekeeper roles. Publishing an app is much harder than publishing a website. It takes more dollars, more know-how. And apps live in silos, not in an interlinked network like the web, limiting content diversity. 

    That said, we can’t ignore the superior user experience that native apps bring. Facebook’s mobile app is much better than its website. You don’t have to log in. There are no page loads. It feels fluid and simple and alive. It’s always on your home screen, so you’ll never forget to check it. These are killer features.

    But an internet that looks like a mall—with 10 or so major brands—is a depressing future. That future isn’t inevitable. A core interest of mine has been rethinking what the web, and the internet, should look like in this new era of computing. 


  2. World in Transition →

    Thoughtful and compelling talk by Albert Wenger on the digital revolution and its impact on the future.

    He argues that every so often, a combination of exponential technological and societal advancements brings about a new “Age”. The Agrarian Age came through farming innovation and animal domestication. The Industrial Age came with the advent of machinery and new manufacturing processes.

    The Information Age, he says, comes from the ubiquity of computing and networking, along with other manifestations of these technologies: artificial intelligence, 3D printing, synthetic biology.

    Why now? "It took a long time for society to change after the things had been invented," he says. 

    While the distant digital future looks almost inevitably bright, he argues, the transition periods of technological change have historically been extremely difficult: wars, hardships, revolutions.

    Wenger’s goal here is to bring forth the most pressing questions about this future. What’s the future of human fulfillment, jobs, privacy, and governance? 

    The video is well worth 15 minutes.  


  3. Teaching Machines to Think →

    My friend James Somers wrote an excellent profile for last week’s Atlantic:

    “It depends on what you mean by artificial intelligence.” Douglas Hofstadter is in a grocery store in Bloomington, Indiana, picking out salad ingredients. “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done.”

    Hofstadter strongly disagrees with the current mainstream approach to artificial intelligence. While AI leaders like Google use massive amounts of data and brute-force computation to understand what you mean (nothing like a human’s thought process), Hofstadter argues that the real value of AI is in building a machine that thinks like a person. This, he says, will teach us about ourselves.

    He’s been working on this alternate vision of AI for decades, but he lacks almost every conventional measure of success: academic acclaim, industry adoption, intellectual import.

    “Ars longa, vita brevis,” Hofstadter likes to say. “I just figure that life is short. I work, I don’t try to publicize. I don’t try to fight.”

    There’s an analogy he made for me once. Einstein, he said, had come up with the light-quantum hypothesis in 1905. But nobody accepted it until 1923. “Not a soul,” Hofstadter says. “Einstein was completely alone in his belief in the existence of light as particles—for 18 years.”

    The piece reminded me of a recent trip to the Computer History Museum in Mountain View. The museum makes it clear that while we sometimes think evolutions in technology are inevitable, they’re very much the products of who works on them, with what support, and when. History usually highlights the winners and forgets the losers, but there’s as much (if not more) to learn from the guys toiling away in their attics.


  4. Dawn of Autonomous Corporations →

    Autonomous corporations will be a new breed of corporations that act and behave, for all practical purposes, just like regular corporations. However, no one ‘owns’ them. Not the creator, not the customers, not the governments, no one really. Sound familiar?

    Bitcoin can be thought of as the first real autonomous ‘corporation’ although you probably don’t see it that way. Think about it – it provides a payment protocol and employs miners to maintain that protocol. The employees are rewarded with ‘stock’ that is split at most into 21 million units. You don’t have to think of Bitcoin this way to get to autonomous corporations, though it will help.

    The idea is the same – this corporation has revenues, expenditures and profits. However, once again, no one owns this entity, it owns itself. The reason it exists is to provide a service at an extremely competitive price that no human-based corporation can provide, so they’ll work higher up the chain to provide ‘value-added’ services.

    Fascinating, thought-provoking post. Though I’d say these types of organizations are less “corporations” and more networks. Because under this logic, isn’t the web, or any open source project, a “corporation”? The better way to think of them is as networks with no owners, distributed inputs, and distributed outputs.
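
    As an aside on the quoted “21 million units” figure: that cap isn’t decreed anywhere; it falls out of Bitcoin’s block-reward schedule, which starts at 50 BTC per block and halves every 210,000 blocks. Here is a minimal sketch of the arithmetic (simplified: the real protocol counts integer satoshis and truncates at each halving, so the true cap lands a hair under 21 million):

      # Simplified illustration of the supply cap; not the actual consensus code.
      def total_bitcoin_supply(initial_reward=50.0, halving_interval=210_000):
          """Approximate the total BTC ever issued by summing block subsidies
          across halving eras until the reward falls below one satoshi."""
          total = 0.0
          reward = initial_reward
          while reward >= 1e-8:  # 1 satoshi = 1e-8 BTC, the smallest unit
              total += reward * halving_interval
              reward /= 2
          return total

      print(total_bitcoin_supply())  # roughly 21,000,000 BTC, just under the cap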


  5. Dolly →

    In a great series called the “Retro Report,” the Times highlights the story of Dolly, the first mammal cloned from an adult cell:

    Some events are just too emotive to be seen clearly until long after the dust has settled. The cloning of Dolly the sheep created a public ruckus because of the assumption that sheep clones would lead straight to human clones. The generation of human embryonic stem cells raised much the same set of fears. Both cases stirred deep anxieties that science was getting out of hand and moving deep into ethically fraught territory.

    The included video is great. It shows how politics and celebrity can get in the way of world-changing science. It’s amazing to watch how foolishly important people reacted to a huge breakthrough in our understanding of biology. Smart politicians and leaders should use this as an example of the dangers of trying to ‘rein in’ progress.


  6. The Next Standard

    If you want to know where mainstream computing is going, look to the iPhone 5S. Historically, what Apple puts in its top-of-the-line iPhone carries through the rest of its product line (including its iPads and Macs) and, more importantly, the rest of the computing industry. Retina displays, Siri, new cameras: these are all innovations that started in the flagship iPhone.

    So there’s a good chance we’ll see fingerprint authentication, dedicated motion sensors, slow motion video, and 64-bit processing in most phones, tablets, and even PCs. The interesting question is when.

    How long does it take for top-end iPhone features to trickle down? It’s a question that really matters for developers: as cool as fingerprint scanning is, if it’s limited to 5S customers, it’s irrelevant.

    TAGGED: writing tech

  7. Apple.com, Year 2000

    I found some cool Apple product videos from the early 2000s. Here are the highlights:

    Read More

    TAGGED: tech apple startups writing

  8. Fast Company: The Ingenious Business Model Behind Coursekit, A Tumblr For Higher Education →

    Coursekit:

    Cool article about us in Fast Company:

    Blackboard, and other LMS, are like the BlackBerry—they rely on wholesale adoption by large organizations, much as the PDA was once approved by corporations and issued en masse to their employees for free or at a discount. Coursekit is more like the iPhone: designed to appeal directly to the end consumer. In this case, Coursekit is betting that individual professors will find it more streamlined and easier to use than the reviled Blackboard. They piloted with profs at 30 campuses this fall, including Stanford, and currently have students serving as evangelists at 82 campuses.

    Like this part too:

    When you look at Coursekit as a potential Facebook or LinkedIn for education, it’s not just a piece of the $500 million LMS market they’re gunning for; it’s a chunk of the $500 billion higher education market. Online institutions could operate entirely through the site; brick and mortars could use it to enhance recruitment, retention, and student services.

    (Source: loreblog)


  9. Be Our Fall Intern →

    hunterhorsley:

    Coursekit Team

    It all started with me carrying a hulking 6ft-wide desk from Ikea to the apartment in the NYC Financial District that doubled as the Coursekit HQ. Since then, my internship for Coursekit has been everything I’d hoped.

    We’re hiring an entrepreneurial intern. Hunter, who started as an intern but is now a key part of the team, wrote a great post about his experience.

    Here’s the job description. Reach out if you’re interested: jobs+intern@coursekit.com. 

    (Source: hunterhorsley)


  10. On Hiring

    A big part of my job is recruiting, yet we haven’t hired a single engineer at Coursekit. The cry you hear from entrepreneurs, that hiring is impossible, is true. But only partially. 

    You see, we’re looking to hire extraordinary people. The brightest in the world. People who can help us build a company. People who want to devote themselves to something world-changing. They’re motivated, brilliant, and they share our passion. 

    I don’t think there’s a shortage of computer science graduates. Hiring the best people, by definition, will always be a challenge.

    When we speak to a candidate, we want to be blown away. I don’t care about your experience or your degrees. Have you created something incredible? How hungry are you? Would I want to work for you? 

    We’re trying to extend class beyond the lecture hall, to change what education looks like online, to make classes about people again. We want to power every class, school, student, and educator, while building the largest academic network.

    We won’t let the bar drop.

    It’s tough, but I’m confident that we’ll build the team we need. 

    If this speaks to you, and you’re looking to make a difference, email me at joseph (at) coursekit.com. 

    TAGGED: hiring startups entrepreneurship tech writing

  11. Steve

    Tonight, for the first time, I feel that strange, deep sadness for someone I’ve never met. It’s the sadness people feel when their favorite musician passes away, when a captivating president dies. It is a unique sorrow, of both selfishness and altruism. 

    Steve Jobs is not dead. But he has famously said that there is no separating Steve Jobs of Apple from Steve Jobs the person. They are one. Apple was as much in Steve’s DNA as he was in the company’s. He would only leave Apple if he were leaving this earth altogether. His life is his work.

    I was probably around 8 years old when I discovered Steve Jobs. Steve answered the question of what I want to do with my life. It all made sense. We’re here to make a difference. To create something extraordinary.

    I’m part of the Apple religion. I love their products, yes, but more importantly, I try to look at the world in “The Apple Way.”

    Having a Mac in 2000 was weird for a third-grader. Everyone had PCs, and grade-schoolers want to be like everyone else. But my dad persisted. “We’re a Mac family,” he said. And then I fell in love with the company. It represented the things that I wanted to be: different, creative, smart.

    Steve defined that vision. The press speaks of him as a tech mogul and the man who reinvented the music business. But I bet Steve doesn’t think of himself that way. It’s not about the money. It’s about building amazing things and changing the world.

    I’ve dreamt of meeting him, but that doesn’t seem likely. Whatever wisdom of his is available online has inspired and guided me.

    While leading Apple is certainly the centerpiece of his biography, his influence extends far wider. He personifies startup culture and a generation of innovation. The idea that one person can start something extraordinary. The idea that the entrepreneur is an artist.

    Whatever happens, he will live on in the millions he’s inspired. Thanks, Steve.

    TAGGED: steve jobs apple tech entrepreneurship writing

  12. Invest in People - #GroupMe →

    My friend David Tisch:

    What I liked about Jared and Steve was that they were insanely confident yet humbly aware of the challenges, all at the same time.  They knew what the product would look like, but more importantly, they knew what the product would feel like.  Their intensity was palpable and you wanted to be around them. And they channeled this energy and vision into building a unique culture both in and around their company.

    That is why, over the past year they’ve built a world-class team, built a world-class product, and built an amazing syndicate of investors and advisors. It’s been so satisfying being part of it. Congrats to Brandon, Steve, Pat, Jeremy, Cameron, Tara, Ajay, the dude with the long hair, and the rest of the GroupMe team. You worked incredibly hard and built a winning product culture in the heart of NYC. The engineering talent and product culture at GroupMe was a major differentiator in a crowded space.

    Great post. Congrats on your first exit, Dave.