Docs offline available?

Is it possible to easily download the docs for offline use? I’ve looked for an API endpoint that would allow me to download the page data in some form or another, but that appears to be a dead end. I could always try to build a scraper of sorts, but I thought I’d ask here first.
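For what it’s worth, a scraper along those lines wouldn’t need to be big. Here’s a minimal sketch using only the Python standard library; the URL and HTML are made up for illustration, since I don’t know the real docs site’s structure:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class DocLinkExtractor(HTMLParser):
    """Collect absolute URLs from every <a href="..."> on a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page URL.
                    self.links.append(urljoin(self.base_url, value))

# Tiny example snippet; a real crawl would fetch each page with
# urllib.request and recurse over the links it finds.
page = '<a href="/docs/intro">Intro</a> <a href="assembly.html">Assembly</a>'
parser = DocLinkExtractor("https://example.com/docs/")
parser.feed(page)
print(parser.links)
```

The fetching, recursion, and saving-to-disk parts are left out, but the link extraction is the fiddly bit.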

I know that, even if it exists, such a dump would probably not include code, models, videos, and other non-text content, but those are all rather easy to download separately.

The reason I ask is that at the moment I am not able to build a FarmBot (I don’t have a garden), but in the future I very much wish to build one. Naturally I hope that FarmBot will stay in business until that time, but such things sadly don’t always go as hoped. I’m trying to create a dump in case that happens.

I understand that using the live version has a lot of benefits; building this archive is just a precaution.


To add to this: it would be great if we could git clone the documentation sources, and download the latest versions as PDFs. :slight_smile:

Haha, well yeah, something like that would be optimal, though having the raw files (I don’t know what structure it uses; I’d guess JSON or Markdown) would be more useful, for example when automatically scraping YouTube for the videos and whatnot.

Unless all the assets (like models, images and videos) would be in the repo as well. That would be most convenient, but I’m assuming that would require a major overhaul on their end. :confused:


@jebba @TheYsconator

It’s a coincidence that you mention this because it just came up at FarmBot internally this week.

It isn’t, and we’re really disappointed that our current vendor will probably never support that feature. As a result, we’re in the planning phase of moving off their platform.

Totally agree. I just set up a new 1.5 bot this week and I really, really wanted to take my Kindle out into the garden for quick documentation reference, but the current platform does not support MOBI, EPUB, PDF, etc. When we move off the current platform, this is going to be a big deciding factor for us.

It will be a huge overhaul, but after some internal discussions, we’ve decided that we will need to deal with the upfront costs in order to have more control over our documentation. We are probably going to use a static site generator of some sort and keep most assets in version control. Although, as hinted at above, the videos will probably have to stay on YouTube; we’ve had trouble in the past finding a video host that can support our level of traffic at a reasonable cost.


@RickCarlino Cool, good to hear. I’m a bit of a LaTeX fan too, but that may be pushing it. :wink:

For videos, perhaps one of the peer-to-peer networks. I don’t know much about it, but I’ve been seeing BitChute used more and more. You can auto-push from YouTube to BitChute too. There’s also LBRY, though it’s probably not as common.

Since we’re on the topic… It would be good to be able to download the CAD files directly, instead of having to go to a site and use a CAD application to extract them. Just a .zip or a git clone of the files would be most swell. Some CAD files are just in Google Drive folders, which is kind of meh too. I realize there’s no quick/easy answer to this.

I’ve manually scraped everything I can to mirror it in git repos for my own use. Those mirrors, fwiw:


@jebba It will certainly be in a “hacker friendly” format, though probably Markdown :wink:

There are probably some easy ways for us to provide offline copies of the support videos, but it’s not a request we’ve seen come up until recently. I will ping @roryaronson about this.


Oooh it’s good to see that you are looking into this! Having the docs in one easily-downloadable place would be a huge improvement for me: it would basically allow me to download all the (digital) stuff I need to build a FarmBot later in life.

For anyone who wants to scrape the YouTube videos: there is an application called jDownloader2 that can help with that (simply paste a YouTube playlist URL and it does the rest). As always: do your own research before downloading/using/installing a piece of software.
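If you’d rather script it, the yt-dlp command-line tool (a widely used YouTube downloader, not something the FarmBot folks endorse, and it needs to be installed separately) can archive a whole playlist. A small sketch that just assembles the command, with a made-up playlist URL:

```python
def playlist_download_cmd(playlist_url, out_dir="videos"):
    """Build a yt-dlp invocation that downloads every video in a playlist,
    naming files by their position in the playlist."""
    return [
        "yt-dlp",
        "--output", f"{out_dir}/%(playlist_index)s - %(title)s.%(ext)s",
        playlist_url,
    ]

cmd = playlist_download_cmd("https://www.youtube.com/playlist?list=PL123")
print(cmd)
# To actually run it (requires yt-dlp on PATH and network access):
# subprocess.run(cmd, check=True)
```

Same caveat as above: check the tool and the terms of service for yourself before using it.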

Writing a scraper for the 3D models shouldn’t be too difficult either. For now I’ll wait and see where the devs go, but you can always poke me if it would be useful to have a scraper for those.

Again: thanks! (Oddly enough, I’ve been looking forward to this ‘feature’ for quite some time, so I’m a happy man. :grin: )