This is a very cool article, thanks for writing it.
What I find cool about it is that most setups I’ve seen using either DAT or IPFS rely on P2P protocols as an afterthought: the sites are usually hosted on a standard HTTP web server while also being shared and pinned on P2P networks. I love that the setup described here eliminates the need to maintain a web server without necessarily introducing vendor lock-in. If the author wanted, they could migrate to a self-hosted or third-party IPFS gateway and pinning service. The more we explore alternative decentralized protocols for real day-to-day usage like blogging, the better those platforms become. I’m excited to try a similar setup in the future.
Wow. I honestly hadn’t thought about it like that. The big motivator for me was the fact that sourcehut doesn’t have static website hosting so I needed a way to bypass running my own server. I just took it as an opportunity to try something new.
Also it’s worth mentioning that most of the work is done by ipfs-deploy. If not for a “zero config” tool, I might not have taken the afternoon to set up this deployment strategy and would have gone with an alternative.
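For anyone curious, the invocation is roughly this (a sketch based on the npm package’s README; the pinner flag, DNS flag, and the `_site` output directory are assumptions, so check `ipfs-deploy --help` for your version):

```shell
# Pin the built site directory to Pinata and update the Cloudflare DNS
# record in one step; credentials are read from environment variables.
npx ipfs-deploy -p pinata -d cloudflare _site
```

So after the static build, a deploy is a single command.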
I recently tried the major new 1.0beta1 release of Beaker Browser for the Dat (now Hypercore) network, and was super impressed by its user friendliness. I mean, in an early-internet/GeoCities style, to be clear. It lets me start writing my own HTML immediately in the browser and serve it. I have a Raspberry Pi and I definitely plan to use it as my own “pinning” device soon, though there’s also a public website called Hashbase that lets you pin up to 100 MB for free. There’s no Cloudflare-level public gateway for it as of now, though; that was certainly a stroke of luck for IPFS. Personally I keep my fingers crossed for all of IPFS, Hypercore/Dat, and Secure Scuttlebutt. I’d love to finally bootstrap a blog and have it shared on all 3 of those platforms xD plus ye olde HTTP as well :)
Regarding SSB, you might like https://github.com/noffle/ssb-webify
On the point about org-mode, would using the Org export framework fit into your workflow? Some time ago I developed the ox-haunt package for exporting articles to the Haunt metadata + markdown format.
That absolutely looks like something I can use.
I was just going to write the org files and export them to markdown with pandoc. I’ll look more into that package after work.
This is incredible! I’ve been wanting to figure out how to use IPFS to reliably host a static website accessible from the normal web. While this might not be as production-ready as AWS or a DigitalOcean CDN, it looks perfect for a personal website/blog.
I’m also impressed by SXML! I might have to use haunt for static site generation!
I doubt I’ll be using Guix, though, since I’m on macOS. I do wish there were a similar package manager for macOS, but I suppose Homebrew is usually sufficient to get the job done.
The Nix package manager totally works on macOS.
Ahh, nice, I didn’t know that. Might have to give it a shot.
Nix is a similar package manager that can be used with macOS but I’m not sure if haunt is in nixpkgs.
Just for fun? Or for some benefit?
Both. A personal website has been on my to-do list for years, and since I’m learning Guile it felt like a good side project. I chose to deploy using IPFS because, as explained in the first paragraph, it’s shiny and I did not want to manage a server.
Unless I’m missing something, I didn’t connect to your site via IPFS, so isn’t there some (HTTP) server you’re running? Also, managing an HTTP server is pretty trivial if you’re already hosting static content.
Cloudflare IPFS gateway as noted in the article:
https://cloudflare-ipfs.com
IPFS here is a replacement for Git{Hub|Lab} pages. Author does not want to run a webserver.
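To make a custom domain work through the gateway, the approach boils down to two DNS records (a sketch of the standard DNSLink setup; the domain and CID here are placeholders):

```
; Point the site at the Cloudflare IPFS gateway
www.example.com.          CNAME  cloudflare-ipfs.com.
; DNSLink TXT record telling the gateway which content to serve
_dnslink.www.example.com. TXT    "dnslink=/ipfs/<your-site-CID>"
```

ipfs-deploy can update the TXT record on each deploy, which is why there is no DNS to touch by hand.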
Well, you are missing something: the part of the article where I talk about using cloudflare-ipfs to access my content over HTTP.
And with regard to managing an HTTP server, I’d rather someone else do it. Between Cloudflare and Pinata, I doubt I’ll ever have to troubleshoot downtime unless I let my domain name expire.
Oops, I misunderstood that point. nvm then.