Some software feature suggestions


So I did not see a topic for this already, so I figured I would go ahead and start one, because I have some ideas I would like to throw out.

Human interaction is still going to be necessary for FarmBot for some time, and some software features could greatly enhance those interactions that must happen.
My main request is a microphone and small speaker on the control housing. While your hands are dirty from working in the garden (during harvest, for example), you could use voice recognition to command the robot. The speaker confirms that a command was recognized, and can play music while you dilly around in the dirt.
So for example, in a polycrop, different plants will be ready at different times. FarmBot gives you a notification through its phone app that some plants are ready or need human tending. Then when you go out to harvest, the bot “points” to where the mature plant is. You can dig it up and, with your hands still dirty, voice-command “next finished plant” or something, and it jogs over and points to the next one. This could also be used for plants that it expects need to be pruned or trimmed for various reasons, such as flowering starting in plants that you do not want to flower. It can “point” to everything that needs human interaction, without you touching anything.

You could also integrate a weight scale and weigh every plant you harvest (the subroutine will be pointing at that plant as you harvest it), so you can get yield data based on whatever experimental watering/scheduling/other variables you might be testing. Maybe you are playing around with how much water to give a particular species. This would give you the ability to quantify your experiment in a catalog-able way.
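To make that concrete, here is a rough sketch (in Python, with made-up field names, not anything FarmBot-official) of the kind of per-harvest record a hacked scale could produce and dump to CSV for later analysis:

```python
# Sketch of a per-harvest record for yield experiments. All field names
# are invented for illustration; a real schema would come from the community.
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class HarvestRecord:
    plant: str
    x_mm: int
    y_mm: int
    weight_g: float        # from the scale at harvest time
    water_ml_total: float  # total water this plant received
    days_to_harvest: int

def to_csv(records):
    """Serialize harvest records to a CSV string for cataloging."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(HarvestRecord.__dataclass_fields__))
    writer.writeheader()
    for r in records:
        writer.writerow(asdict(r))
    return buf.getvalue()
```

A record like this per plant, tagged with the experiment's variables, is all the “catalog-able” part really needs.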

One of the more critical factors in making this system really great is having good data behind everything. I see OpenFarm is being integrated, and this is extremely important. Good data on each plant: growing space, synergies, nutrient needs, waste products, all that stuff. It would be kind of cool to have a “simulator”, kind of like the Game of Life, that takes in the plant attributes and runs through several life cycles of adding and subtracting nutrients from the soil, insects, random factors, and harvests, optimizing the farm layouts automatically based on the best yields from these models. It is outside the scope of where the project is now, and is really a whole project of its own, but it would be super cool.
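Just to show the shape of the idea, here is an extremely simplified toy of such a simulator loop. Every number and rule here is invented for illustration; a real model would pull per-plant attributes from OpenFarm data:

```python
# Toy "garden simulator": each season, every plant draws nutrients from
# the soil and contributes some yield if enough nutrients remain.
# All values are made up purely to illustrate the loop structure.
def simulate(plants, soil_nutrients, seasons):
    """plants: list of (nutrient_need, yield_per_season) tuples.
    Returns (total_yield, remaining_soil_nutrients)."""
    total_yield = 0.0
    for _ in range(seasons):
        for need, produce in plants:
            if soil_nutrients >= need:
                soil_nutrients -= need
                total_yield += produce
        soil_nutrients += 5.0  # some recovery (compost, cover crop, ...)
    return total_yield, soil_nutrients
```

Run that over many candidate layouts and keep the best-yielding one, and you have the skeleton of the optimizer described above.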


After looking through the forum, reviving this topic to add my thoughts seemed the most logical…

1). The ideas that @AndrewV threw out there (microphone, speaker, music & voice command & control) - I REALLY like these ideas, and at least adding a mic & speaker should be light lifting (I will task our students to research this). How great to have music playing while tending the farm(bot)! Adding voice command & control might be outside the capabilities of the Raspberry Pi, but maybe not.

2). Adding a “Simulated FarmBot” mode/option to the farm designer app. This would allow for what-if’s, training/learning and maybe even some debugging help.

I hope that re-igniting this thread is the right way to re-kindle the topic in the community.

Thoughts, any & all…?


Hey wow thanks for the revival :slight_smile:

  1. As this was two years ago now, voice recognition has improved quite a bit. This should actually be possible fairly easily with something like Google Assistant, If This Then That (IFTTT), and some custom webhooks. I have already used this combination for another project with success. I used a Particle Photon there, but it shouldn’t be too hard to make a custom web service running on the Pi, and/or use a broker other than the Particle cloud.
    (Listening activator) (Custom keyword) (Custom command)
    “Okay Google, Farmbot, show me the next plant”
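The Pi-side piece of that chain could be as small as a function that maps the phrase delivered by the webhook to a FarmBot action. This is only a sketch; the command phrases, action names, and payload format are all made up for illustration:

```python
# Hypothetical command dispatch for an IFTTT-style webhook payload.
# The webhook body is assumed to be JSON like {"command": "next plant"}.
import json

# Illustrative mapping from spoken phrase to an internal action name.
COMMANDS = {
    "next plant": "move_to_next_mature_plant",
    "water here": "water_current_position",
    "play music": "start_audio_playback",
}

def dispatch(webhook_body: str) -> dict:
    """Parse an incoming webhook payload and pick an action to run."""
    payload = json.loads(webhook_body)
    phrase = payload.get("command", "").strip().lower()
    action = COMMANDS.get(phrase)
    if action is None:
        return {"ok": False, "error": f"unknown command: {phrase!r}"}
    # A real service would now trigger the matching FarmBot sequence.
    return {"ok": True, "action": action}
```

Wrap that in any small web server on the Pi and point the IFTTT webhook at it.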

The speaker portion would be as easy as some cheap speakers plugged into the audio jack. When you have the mic and google assistant, you basically get music control for free.
You could make it super easy and just strap an Amazon Echo to it somewhere out of the rain and write some Alexa skills that talk to the Pi.

Or, if all this talking to your FarmBot gets annoying (is my weird neighbor talking to his plants or to his robot?), make a custom tool-head with just a simple button on it that you press when you finish the next “human required” task. My main thought was that there are tasks the robot can’t do which require a human, but the human will have dirty hands and must still interact with the machine to find where the tasks must be completed.

  2. The simulator would still be great, but I still think it needs solid data to operate, and the integrated scale to get yield info. We could record massive amounts of data and feed that into sims, and, if you want to get buzz-wordy, into machine intelligence.
    FarmBot location, weather, plants used, water applied, fertilizer applied, plant neighbors, soil pH… and from that data, extract optimizations for all the given parameters. Then eventually you don’t even need to plan the bed, or the required water dosing, or the required fertilizer. It all becomes pre-calculated and optimized with minimal human intervention.
    You would just say, “Make me a garden for two people, and I don’t like broccoli. Go.”
    But to do all of this, we need data feedback on the output, and at the moment I think there is still no way to get that from FarmBot. The easy way might be a hacked kitchen scale used at harvest. The hard way is image recognition of each harvested plant.
    You have the power to aggregate this data as long as people opt in. It would be an extremely valuable tool.
    This is honestly big enough to be a machine learning PhD dissertation.
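Even before any machine learning, the simplest version of “extract optimizations from the data” is just grouping yield by one parameter at a time. A toy sketch (parameter names invented for illustration):

```python
# Toy feedback loop: given (water_ml_per_day, yield_g) pairs from many
# harvests, find the watering level with the best average yield.
from collections import defaultdict

def best_watering(records):
    """records: iterable of (water_ml_per_day, yield_g) tuples."""
    totals = defaultdict(lambda: [0.0, 0])  # water -> [yield sum, count]
    for water, yield_g in records:
        totals[water][0] += yield_g
        totals[water][1] += 1
    # Average yield per watering level; return the best level.
    averages = {w: s / n for w, (s, n) in totals.items()}
    return max(averages, key=averages.get)
```

The same grouping works for fertilizer dose, plant spacing, neighbors, and so on, one parameter at a time, before anyone reaches for the fancier tooling.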


In order to collaborate with your FarmBot, I propose implementing a software feature that allows you to move the robot by hand to a specific position and teach that position directly as a movement. This could be done by adding a physical button on the y-carriage which you press to initialize the teach process. The FarmBot would then disable the motor brakes and measure the rotary encoder movements until you release the button. The new position would be saved in a new sequence. This would be helpful for teaching, by example, the universal tool mount position…
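In pseudo-Python, the teach loop might look like the sketch below. To be clear, the `button`, `brakes`, and `encoders` helpers are hypothetical stand-ins; nothing like them exists in the FarmBot firmware today, this is just to illustrate the flow:

```python
# Sketch of the proposed teach-by-hand flow, using hypothetical hardware
# helpers (button, brakes, encoders) purely for illustration.
import time

def teach_position(button, brakes, encoders, poll_interval=0.05):
    """While the teach button is held, release the motor brakes so the
    gantry can be pushed by hand; on release, read the encoders and
    return the final position to be saved into a sequence."""
    brakes.release()
    try:
        while button.is_pressed():
            time.sleep(poll_interval)
        # The encoders now reflect wherever the user pushed the gantry.
        return encoders.read_position()  # e.g. (x, y, z) in steps
    finally:
        brakes.engage()  # always re-engage, even on error
```

Saving the returned position as a MOVE ABSOLUTE step in a new sequence would complete the feature.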


I REALLY like the idea of finding a place/way to strap an Echo (or the like) to a location on or about the FarmBot! This would pretty much do it all, leaving some light lifting to send commands to the FarmBot.
Things to think on if the above were to work (in my humble noobie opinion):

  1. Protection from the elements while it still maintains functionality. It would not do well sealed in a box.
  2. Security: a random person says “OK Google, make FarmBot water location #1 for the next week”.

I did not really think through the “simulator” idea - you make many good points, and this seems like it could get very complex. I will think more on this.