IPFS isn’t exactly a well-known technology yet, even among many in the Valley, but it’s quickly spreading by word of mouth among folks in the open-source community. Many are excited by its potential to greatly improve file transfer and streaming speeds across the Internet.

From my personal perspective, however, it’s actually much more important than that. IPFS eliminates the need for websites to have a central origin server, making it perhaps our best chance to entirely re-architect the Internet — before its own internal contradictions unravel it from within.

How, and why? The answer requires a bit of background.

Why We Have A Slow, Fragile And Forgetful Web

IPFS is a new peer-to-peer hypermedia protocol that aims to supplement, or possibly even replace, the Hypertext Transfer Protocol that rules the web now. Here’s the problem with HTTP: when you visit a website today, your browser has to connect directly to the servers hosting that site, even if those servers are far away and the transfer eats up a lot of bandwidth.

Because networks bill one another under peering agreements, every hop the data takes across the network costs the provider money and wastes bandwidth. Worse, HTTP downloads a file from a single computer at a time, rather than pulling pieces from many computers simultaneously.

Consequently, we have what we’re stuck with now: a slow, expensive Internet, made even more costly by predatory last-mile carriers (in the U.S., at least) and the accelerating growth of connection requests from mobile devices. It’s not just slow and expensive; it’s unreliable. If one link in an HTTP transfer cuts out for whatever reason, the whole transfer breaks. (Whenever a web page or media file is slow to load, a problem with a link in the HTTP chain is among the likeliest culprits.)
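
To make that contrast concrete, here is a rough Python sketch (not IPFS itself) of the difference between pulling a whole file from one origin server and assembling it from pieces served by several hosts at once, the way peer-to-peer systems do. The mirror URLs are hypothetical placeholders.

```python
# A rough sketch of the contrast described above, not IPFS itself.
# Plain HTTP pulls a whole file from one origin; a multi-source download
# pulls pieces from several hosts at once and reassembles them.
# The URLs below are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen

def fetch_piece(url, start, end):
    """Ask one host for just a byte range of the file."""
    req = Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urlopen(req) as resp:
        return resp.read()

# Plain HTTP: everything comes from a single, possibly distant, server.
# whole_file = urlopen("https://origin.example.com/video.mp4").read()

# Multi-source: split the file into pieces, fetch them concurrently from
# different hosts, then stitch them back together.
mirrors = [
    "https://mirror-a.example.com/video.mp4",
    "https://mirror-b.example.com/video.mp4",
]
piece_size = 1 << 20  # 1 MiB per piece

with ThreadPoolExecutor() as pool:
    futures = [
        pool.submit(fetch_piece, url, i * piece_size, (i + 1) * piece_size - 1)
        for i, url in enumerate(mirrors)
    ]
    pieces = [f.result() for f in futures]

data = b"".join(pieces)
```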

Remaking The Internet With IPFS

The InterPlanetary File System — a tribute to J.C.R. Licklider’s vision of an “intergalactic” Internet — is the brainchild of Juan Benet, who moved to the U.S. from Mexico as a teen, earned a computer science degree at Stanford and started a company that was acquired by Yahoo! in 2013. Last year, at Y Combinator, he founded Protocol Labs, which now drives the IPFS project and its modest aim of replacing protocols that have seemed like facts of life for the last 20 years.

As a peer-to-peer distributed file system that seeks to connect all computing devices with the same system of files, IPFS seeks to improve on HTTP in several ways. Two, Juan told me in a recent conversation, are key:

“We use content-addressing so content can be decoupled from origin servers, and instead, can be stored permanently. This means content can be stored and served very close to the user, perhaps even from a computer in the same room. Content-addressing allows us to verify the data too, because other hosts may be untrusted. And once the user’s device has the content, it can be cached indefinitely.”
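
Here is a minimal Python sketch of content-addressing in general terms (IPFS’s real identifiers are more elaborate than a bare hash): the address of a piece of content is derived by hashing the content itself, so any host, trusted or not, can serve a copy and the receiver can verify it locally.

```python
# A minimal sketch of content-addressing in general terms, not IPFS's
# exact format: the address of a piece of content is a hash of the
# content itself, not a pointer to the machine that hosts it.
import hashlib

def content_address(data):
    """Derive an address from the bytes themselves, not from where they live."""
    return hashlib.sha256(data).hexdigest()

def verify(data, address):
    """Anyone holding the address can check that a copy is genuine."""
    return content_address(data) == address

page = b"<html><body>Hello, permanent web</body></html>"
addr = content_address(page)  # the same bytes always yield the same address

# A copy fetched from a nearby, untrusted peer can be checked locally,
# then cached indefinitely, since the address pins the exact content.
copy_from_peer = page
assert verify(copy_from_peer, addr)  # a tampered copy would fail this check
```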

IPFS also addresses security problems that plague our HTTP-based Internet: content-addressing and content-signing protect IPFS-based sites, making DDoS attacks impossible. And to help mitigate the damage of discontinued websites, IPFS makes it easy to archive and keep serving important public-record content.

IPFS’s final core improvement is decentralized distribution, which makes it possible to access Internet content despite sporadic service or even while offline: “We make websites and web apps have no central origin server,” Juan explained. “They can be distributed just like the Bitcoin network is distributed.” This is something HTTP simply cannot do, and it would be an especial boon for networks without top-notch connectivity (much of the developing world) and for access outside metropolitan areas.
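
As a toy illustration of that idea (again, not IPFS’s real protocol), the sketch below resolves a content address by checking a local cache first and then asking any reachable peer; once content is cached, it stays available even with no connection at all. Every name here is invented for the example.

```python
# A toy illustration of distribution with no central origin server,
# not IPFS's real protocol. Content is looked up by its address in a
# local cache first, then from any reachable peer; cached content stays
# available even while offline. All names here are invented for the example.
import hashlib

def content_address(data):
    return hashlib.sha256(data).hexdigest()

local_cache = {}   # content this device has already seen, keyed by address
peers = [{}, {}]   # stand-ins for other machines on the network

def fetch(address):
    if address in local_cache:            # no network needed at all
        return local_cache[address]
    for peer in peers:                    # otherwise, any peer that has it will do
        if address in peer:
            data = peer[address]
            if content_address(data) == address:  # verify before trusting
                local_cache[address] = data        # cache for next time
                return data
    return None                           # nobody reachable has it (yet)

page = b"a page published once, served from anywhere"
addr = content_address(page)
peers[1][addr] = page                     # some peer, not an origin server, holds it
print(fetch(addr) == page)                # True; later fetches hit the local cache
```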

Released in alpha last February, IPFS has already started to see a lot of experimentation among early adopters. On September 8, for instance, Neocities became the first major site to implement IPFS, following a call from the Internet Archive for a distributed web. We currently suffer a constant loss of websites as their owners abandon them over the years — a growing crisis for our collective Internet memory — and this is a small but important step toward a more permanent web.

But will websites owned by large corporations follow Neocities’ lead, adopting such an as-yet-untested protocol — especially when the mere mention of “peer to peer” often terrifies them? That takes me to my final point.

Why IPFS Matters For The Future Of Internet Business

As I explain in my upcoming book, we are fast approaching a point where the cost of delivering content will outstrip the benefits — and profits. The major Internet companies are already struggling to stay ahead of our content demands, with armies of engineers at companies like Akamai, Google and Amazon devoted to this one problem.

And they haven’t even seen the worst of it: Thanks to rapid adoption of low-cost smartphones, whole continents of consumers will go online in the coming decade. The Internet of Things promises to only compound this challenge, as billions of devices add their own demands on our rapidly dwindling connectivity.

We are already in desperate need of a hedge against what I call micro-singularities, in which a viral event can suddenly transfix billions of Internet users, threatening to choke the entire system in the process. (When the micro-singularity involves a natural disaster or other emergency, that outage could be life-threatening.)

Netflix recently started researching large-scale peer-to-peer technology for streaming, an early, hopeful sign that companies of its size and reach are looking for smarter content-distribution methods. Netflix, YouTube and all the other bandwidth-heavy services we cherish would thrive on an Internet remade by IPFS, which would dramatically reduce the cost and time of serving content.

Beyond improved service, IPFS would help the Internet grow into the system we’ve always aspired it to be at our most idealistic, but cannot become with our current protocols: Truly capable of connecting everyone around the world (even offline) to a permanent but constantly evolving expression of who we are, and aspire to be.
