Documentation for using the web app -- setup

I need some basic documentation for the web app. Here are the things that would be helpful to get started:

  1. Controls -> Move: Define those buttons, at least with a tooltip. Just from their layout it’s not clear which button relates to which axis. From fiddling around, it seems that < and > are for x, the ^ and v between the x controls are for y, and the ^ and v to the right of the x controls are for z. At this time, I have no idea what the “home” button is for.

  2. Controls->Peripherals. What’s it for?

  3. Device -> Farmware. Nothing is shown there. Is that a problem?

  4. Device -> Device. Auto update doesn’t seem to work. The camera dropdown says “None”, but I do have a camera on the bot?

  5. Device -> Weed detector. The help has all of two words: “Detect Weeds”. I have no idea what to do with this section. It does, however, show some images from my farmbot.

  6. Device -> Hardware. This seems fairly important, but I have no idea how to “calibrate” any of the axes. What are endstops, and should they be enabled? Should the encoders be enabled? I’m pretty sure nothing interesting will be possible until I get the bot calibrated and homed. I operate CNC machines all the time. Most of my machines have hall-effect or optical homing switches. The machines power up and get homed to establish “machine” coordinates; then you’re off to running g-code in program coordinates after you set a “work offset” that translates from machine to work coordinates. There doesn’t seem to be any theory of operation or description of how the coordinate spaces are supposed to work, or of how one goes about homing a system that doesn’t have any stops.
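To make the coordinate-space question concrete, here is a minimal sketch of how machine and work coordinates typically relate on a CNC controller. This is an illustration under the usual CNC conventions, not FarmBot code; the function names are hypothetical:

```python
# Sketch of CNC coordinate spaces (hypothetical names, not a FarmBot API).
# Machine coordinates are established by homing against known switches or
# sensors; work coordinates are machine coordinates shifted by a work offset.

def to_work(machine_xyz, work_offset):
    """Translate a machine-space position into work-space coordinates."""
    return tuple(m - o for m, o in zip(machine_xyz, work_offset))

def to_machine(work_xyz, work_offset):
    """Translate a work-space position back into machine-space coordinates."""
    return tuple(w + o for w, o in zip(work_xyz, work_offset))

# Example: after homing, the machine origin is (0, 0, 0); a work offset of
# (100, 50, 0) places the work origin 100 mm in X and 50 mm in Y from home.
offset = (100, 50, 0)
print(to_work((150, 75, -10), offset))   # (50, 25, -10): position relative to the workpiece
print(to_machine((0, 0, 0), offset))     # (100, 50, 0): work origin in machine space
```

Without a homing step to anchor the machine origin, neither translation is meaningful, which is the crux of the calibration question above.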

  7. Tools -> Tools. Since that list is empty, I assume you want every customer to enter the basic tools that ship with every robot? Or is this for “other” tools that didn’t come with the system?

  8. Tools -> Toolbay1. It seems like you need to add tools to the toolbay, but none are found in the dropdown. Also, you need to somehow figure out the x, y, and z location of each tool? How do you configure the second tool bay?


Thanks for the feedback. We have software documentation for the web app here. We’ll work on making those points more clear, and we plan on adding a navigation link to the documentation in the web app soon.

I spent a lot of time looking over that linked information BEFORE posting my questions. I also clicked the little “?” help icon on each of those forms in the web application itself. The web app itself has more information than the linked document. Most of the pages that I referenced above are a) inconsistent with the current web app or b) a simple screenshot of the page with a bulleted list of what’s on the page. None of the details that I’ve requested are in either the web app or the online documentation site. I also looked for documentation elsewhere via Google searches, to no avail.

For example, the linked page is the entirety of the online documentation for the Device tab. The “calibration” section is simply a bulleted list of the words in the left-hand column of the screen, with no additional information, as shown in this screenshot:

Pressing the “?” icon on the web app gives you this paragraph:

“Change settings of your FarmBot hardware with the fields below. Caution: Changing these settings to extreme values can cause hardware malfunction. Make sure to test any new settings before letting your FarmBot use them unsupervised. Tip: Recalibrate FarmBot after changing settings and test a few sequences to verify that everything works as expected. Note: Currently not all settings can be changed.”

That’s not enough information to carry out a calibration process.

I’ve added descriptions for the settings. As with the web app, the documentation is under continual improvement. Thanks for pointing this out.

Ok, I’ve read over what you wrote. I don’t think I understand this part:

"Enable endstops
If using end-stops instead of rotary encoders, enable them here.

Enable encoders
If using rotary encoders, enable them here.

Firmware-level support for rotary encoders is still under development."

Elsewhere it says you have to have either “endstops” or “encoders” in order to calibrate the system. Are “endstops” included with the kit? Are those just mechanical stops, or are they sensors of some kind? I don’t think the latter.

I’m just trying to figure out how I’m going to calibrate the system, set the home and then move on to defining the tool positions.

Under device->device, for the camera this is what it says:

“Camera selection
Select the type of camera you are using in the camera selection dropdown. Choices are USB Camera and Raspberry Pi Camera. Defaults to USB camera. Test by using the Take Photo button in the Weed Detector widget.”

The web app does not allow you to select the camera type and simply says “none” on the drop down. Is this something that is also under development?

I see that the “weed detection” section now says “Work in Progress”, so I guess that doesn’t quite work yet? Is there a list or map of what actually works vs. what doesn’t?

I still don’t see how to create the second tool bay nor do I see definitions for the control buttons.

I’ve added a note in the documentation for manually calibrating the device until automatic support is fully implemented. Endstops are not included in the kit, since rotary encoders will perform all calibration and homing functions as well as keep track of position and detect stalls.
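The staff reply above says the encoders will also detect stalls. To illustrate the general idea (this is a hypothetical sketch, not FarmBot firmware; the names and threshold are illustrative), stall detection usually compares the motion commanded to the motor against the motion the encoder actually reports:

```python
# Hypothetical sketch of encoder-based stall detection: compare the steps
# commanded to a stepper motor against the movement the encoder reports.
# A large discrepancy means the axis is blocked or has hit the end of travel.

STALL_THRESHOLD = 10  # allowed mismatch, in encoder counts (illustrative)

def is_stalled(commanded_steps, encoder_counts, counts_per_step=1):
    """Return True when the encoder lags the commanded motion too far."""
    expected = commanded_steps * counts_per_step
    return abs(expected - encoder_counts) > STALL_THRESHOLD

print(is_stalled(100, 98))   # False: within tolerance, motion is normal
print(is_stalled(100, 40))   # True: the axis stopped moving, likely a stall
```

The same mechanism can substitute for physical endstops during calibration: drive toward the end of travel until a stall is detected, and treat that point as the axis limit.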

If you click into the text of the camera selection dropdown where it displays None, the camera options will appear. Optionally, you can start typing USB in the field and the USB option should appear. However, if you are using the camera included in the kit, this shouldn’t be necessary since a USB camera is used by default.

You may take and view photos in the Weed Detector widget.

I’ve added a note to the documentation regarding multiple toolbays; multiple virtual toolbays are coming soon. You should have no trouble entering all of your tools into the single toolbay in the web app, since the toolbay widget accepts unique coordinates for each tool.

On the camera, I was clicking on the drop down “arrow” and nothing would change. I tried clicking in the text per your suggestion and that works.

bgmoon, How were you able to calibrate your Farmbot?

How do you manually calibrate the Farmbot?

I’ve not managed it. I frankly think that the whole calibration process and assumptions will need a rethink for the product.

As I stated, I’ve operated CNC machines for decades, machines that cost $200 and ones that cost over $100K, both hobby-class and professional. There are certain principles that even the cheapest hobby units follow that seem to be missing from this system.

Homing AND encoders are really mandatory for these machines to work on a daily basis. Even with encoders, that will not be sufficient: motors slip, stuff gets on the tracks, tracks wear, and power goes out. None of these situations will be handled by encoders alone. And right now, we don’t even have encoders.

For example, when power goes out while the robot is moving (guaranteed to happen on almost every system out there), there is no way of knowing how far the bot moved as power was going down. Then power comes back on; where is the bot? Encoders will absolutely not tell you where the bot is unless they are absolute encoders, which these are not.
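The distinction drawn above between incremental and absolute encoders can be shown with a toy model (purely illustrative, not FarmBot firmware): an incremental encoder only counts ticks since power-on, so any position accumulated before a power loss is simply gone.

```python
# Toy model of an incremental (relative) encoder: it counts ticks, but the
# count lives in volatile memory, so a power cycle resets it to zero.

class IncrementalEncoder:
    def __init__(self):
        self.ticks = 0            # volatile: lost on power cycle

    def step(self, n):
        self.ticks += n           # accumulate relative motion

    def power_cycle(self):
        self.ticks = 0            # all position knowledge is gone

enc = IncrementalEncoder()
enc.step(500)                     # the bot moves 500 ticks along an axis
enc.power_cycle()                 # power goes out mid-move
print(enc.ticks)                  # 0: the firmware now believes it is at home
```

An absolute encoder would instead read its true shaft position at any time, which is why a machine without one must re-home against a physical reference after every power cycle.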

Without homing on a power cycle, you will never know your actual location. The operator of the bot may not even realize that power was lost at some point in a 24/7 product like this. How often has power gone out overnight at your house? Imagine you are sleeping, power goes off in the middle of the night, and a regimen starts up when the bot powers back up. Will it damage the system, destroy your plants, or think your plants are weeds because of the loss of calibration?

In short, ALL these CNC machines are going to need homing and encoders to effectively operate 24/7 with limited supervision.

Right now, with the current state of the software and the lack of “endstops”, the system is not usable. PERIOD. Each time the robot power cycles, it re-homes to its current location, setting it as the (0, 0, 0) point. At that point, everything from tool positions to the locations of your plants is completely wrong.

I tested this last night. I power cycled my machine and when it came back up it re-homed to my current position :frowning:

Please don’t take this as being super negative or anything; it’s just a realistic understanding of the state of the system at this point in time.



Hi all,

Thank you for the feedback! We’re working ’round the clock to get features built and things stabilized across the app experience. The top priority for us is encoder support at the firmware level, along with the other hardware parameters such as “Invert Motors”, etc. We understand this is a pain point for everyone in getting set up and reliably using their devices, and it has admittedly taken longer to get working than we anticipated. Up until this point we have been “manually calibrating” our devices by simply moving them by hand to (0, 0, 0) and then booting up. If anything ever went wrong (lost position for some reason), we just manually moved the bot back to (0, 0, 0) and rebooted.

Once closed-loop feedback control is working, calibration will be more or less automated: you’ll just click the “Calibrate” buttons and FarmBot will find its end locations, measure its overall size, and move to the home position. As Billy mentioned, this will need to happen automatically whenever the bot loses power or reboots, and manually whenever a user reconfigures or adjusts their hardware.

Encoder support is very close for us. Our main firmware contributor, Tim, has been working on it between his day job and taking care of his family, and he just got his production FarmBot device set up this weekend to do the proper testing needed before deploying code to all the bots.

Please keep the feedback on the documentation coming. This is how we improve it for everyone. If you have things you would like to add, there is also a “Suggest Edits” button that you can use to help us out. We review all suggestions and have merged in quite a bit to the hardware hub from the community.

Also remember that because all of this is open-source, anyone can help us with the software if they have features they want developed. Here is where everything is hosted: You can also “Watch” the specific repositories so that you get email updates on the day-to-day progress.


The ability to share completed sequences and regimens would be helpful.

@bgmoon have you been able to configure any tools in the toolbays? I’ve found that the movement and coordinate system of the bot is extremely unreliable.

Sometimes an axis will move for no reason. Other times it appears that the movement of the bot isn’t accurate at all. It’s almost unusable in my experience.

Honestly, I’ve put the project on hold until they have the encoders working. I’m fairly certain that without complete and accurate feedback, things will just get too frustrating.

I’ve actually taken this week off work to work on my own robots :slight_smile: