In 2016 we participated in the NASA Space Apps Challenge. Our team wanted to grow cress with as much automation as possible.
Our mission statement was:
set up a demonstrator greenhouse
autonomous farming through machine learning
add a gaming part for users to help nurture the plants
We never implemented the last part, but the first two were successful.
On the first Space Apps Challenge weekend we built our first planting box:
Over the following weekends we improved the first box a lot. We used a Raspberry Pi camera to take a photo from above every five minutes:
After a few months we started to build a second and later a third box. Every iteration was better because we learned from previous versions.
The boxes are built by cutting IKEA Samla boxes. They are cheap and easy to work with.
Every box had a DHT22 sensor for temperature and humidity inside the box and another one outside. After the first weeks we added fans to exchange air between the inside and the outside of the box; this was very important to prevent mold on the plants. We used very cheap pumps that can be driven directly by a Raspberry Pi. The timing was tricky because you cannot control the amount of water, only the time the pump is powered.
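Because the pump only has an on/off switch, watering comes down to converting a desired volume into a run time. A minimal sketch of that idea, assuming `RPi.GPIO`, a relay on an illustrative pin, and a flow rate you would have to measure yourself:

```python
def pump_seconds(target_ml, flow_ml_per_s):
    """Convert a desired water volume into a pump-on duration.

    The flow rate has to be calibrated once per pump, e.g. by
    running it for 10 seconds into a measuring cup.
    """
    return target_ml / flow_ml_per_s


def water(seconds, pin=17):
    """Power the pump for a fixed time via a GPIO-driven relay.

    Pin 17 is an illustrative assumption, not the project's wiring.
    """
    import time
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(pin, GPIO.OUT)
    GPIO.output(pin, GPIO.HIGH)  # pump on
    time.sleep(seconds)
    GPIO.output(pin, GPIO.LOW)   # pump off
    GPIO.cleanup(pin)
```

With a measured flow rate of, say, 10 ml/s, `water(pump_seconds(50, 10))` would deliver roughly 50 ml.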
In the first iteration we measured the moisture of the soil with very cheap metal sensors that measure the conductivity between two probes. Later we additionally used capacitive sensors.
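Both sensor types deliver an analog signal, and the Raspberry Pi has no analog inputs, so the reading typically goes through an ADC such as an MCP3008 over SPI. What remains in software is mapping the raw value onto a moisture scale; the calibration constants below are illustrative, not our actual values:

```python
# Raw ADC readings at the two calibration points (per-sensor values,
# determined by measuring once in dry and once in saturated soil).
DRY = 850  # illustrative raw reading in dry soil
WET = 400  # illustrative raw reading in saturated soil


def moisture_percent(raw, dry=DRY, wet=WET):
    """Map a raw ADC value onto 0..100 % moisture, clamped.

    Capacitive sensors read *lower* when wet, hence dry - raw.
    """
    pct = (dry - raw) * 100.0 / (dry - wet)
    return max(0.0, min(100.0, pct))
```

A reading halfway between the two calibration points then comes out as 50 %.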
To get the best possible photos we switched on a small 12V LED before taking the photo and switched it off afterwards.
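The capture step can be sketched as a small wrapper: LED on, photo via `raspistill`, LED off. The GPIO pin, output directory, and one-second settle time are illustrative assumptions:

```python
import subprocess
from datetime import datetime


def photo_command(outdir="/data/photos"):
    """Build a raspistill command with a timestamped filename."""
    name = datetime.now().strftime("%Y%m%d-%H%M%S") + ".jpg"
    return ["raspistill", "-o", f"{outdir}/{name}"]


def take_photo(led_pin=27):
    """Light the box, take one photo, switch the light off again."""
    import time
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(led_pin, GPIO.OUT)
    GPIO.output(led_pin, GPIO.HIGH)       # LED on for even lighting
    time.sleep(1)                         # let the exposure settle
    subprocess.run(photo_command(), check=True)
    GPIO.output(led_pin, GPIO.LOW)        # LED off again
    GPIO.cleanup(led_pin)
```

Scheduling `take_photo` every five minutes is then a one-line cron entry.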
Image of the setup from above:
In the final iterations (the last three months) we experimented with light, or rather the absence of light. Cress still grows without any sunlight, but it tastes different and is more yellowish than green.
The Raspberry Pis pushed every photo and every sensor value to a REST API. This website was built in Python with the Django web framework.
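On the Pi side, pushing a sensor value is a small HTTP POST. A hedged sketch using `requests`; the endpoint URL and field names are assumptions, not the real cress.space schema:

```python
import json
from datetime import datetime, timezone

# Illustrative endpoint, not the actual API route.
API = "https://cress.space/api/measurements/"


def build_payload(box_id, sensor, value):
    """Assemble one measurement as a JSON-serializable dict."""
    return {
        "box": box_id,
        "sensor": sensor,
        "value": value,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }


def push(payload):
    """POST one measurement to the API and fail loudly on errors."""
    import requests

    resp = requests.post(API, json=payload, timeout=10)
    resp.raise_for_status()
```

A reading from the inner DHT22 would then be sent as `push(build_payload(1, "dht22_inside_temp", 21.5))`.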
The cress.space website showed one image per day of the plants growing. Here are the last days of a cycle growing white clover:
The code of the website is archived at https://github.com/aerospaceresearch/cress-website.
On the Raspberry Pis most of the code was written in either Bash or Python. This code is also on GitHub: https://github.com/aerospaceresearch/cress-meta.
For MRMCD 2017 we trained a machine learning model to predict whether we should water the plants based on the camera images. The system was a binary classifier built from CNN layers, trained on the images we took. We used TensorFlow when it was still pretty new.
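The shape of such a model can be sketched with today's Keras API; the layer sizes and input resolution here are illustrative, not the architecture we actually trained:

```python
import tensorflow as tf
from tensorflow.keras import layers, models


def build_model(size=128):
    """Small CNN ending in a single sigmoid: P(water now)."""
    model = models.Sequential([
        layers.Input((size, size, 3)),           # RGB photo from above
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # binary: water / don't
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Training then reduces to `model.fit` on batches of labeled box photos, with the label being whether a human would have watered at that moment.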
The talk at MRMCD in German:
The code is on GitHub: https://github.com/mfa/cress-classify. The data isn't online anymore; if you want to experiment with it, send me an email.
We ran about 230 growing cycles with cress, phacelia, or white clover. The Raspberry Pi cameras took over 900,000 photos, most of them of the growing field inside the boxes.
All our growing cycles, each with the plant, a success score, and some statistics, are listed in https://raw.githubusercontent.com/mfa/cress-classify/master/experiments.org.
In those 2.5 years we saw a lot of things fail: too much water, no water, mold, sensor failures, and a fair amount of human error. But we learned a lot about growing plants, Raspberry Pis, water, and electronics.