An IoT project: Monitoring soil moisture (Phase 3) – Adding Watering Capability (Part 2)

In the previous articles, we covered the implementation of a water pump station as a separate module and did some discovery work on adding WiFi capability to the Soil Moisture Monitoring system. They can be found here and here respectively.

In this article, we will continue to look at integrating the water pumping system with the main system and hook up the pipes to the pots.

Building an irrigation system

Before we start connecting the water pump station to the main system, we need to first build the pipe network that will ensure water is delivered to the right place.

At first, I had the idea of using both silicone tubing and Lego for this aspect. The silicone tubes would serve as the water delivery medium while Lego would be used to build the emitters installed around the pots.

For the Lego water emitters, I’ll admit I didn’t sketch out a design to guide me through the construction process. Just like any creative endeavour I partake in, including writing, my process is akin to that of a “gardener” or “pantser”. You can read up more on what these mean here. Put simply, I have some rough ideas in my head, then set out to explore and experiment, trying to achieve a desired outcome.

But I digress.

So, I had the bricks laid out in front of me and I went exploring. It took me several tries before I settled on a final design that I’m happy with.

The idea behind the above design is that the silicone tube goes into the orange cone-shaped bricks. The good thing about those bricks is that the tube stays relatively secure. The emitter would be attached at a height and, once water exits the tube, it would follow the slope down onto the soil below.

However, my experiments with getting water to flow down the slope failed. Either the water jetted out when there was the slightest water pressure, or it bunched up and rolled off the slope as a globule.

I gave up on using Lego to build the water emitter and went to a hardware store to get actual water emitters. At the store, I could only find a 3-way coupling and a 0-6l/h line-end dripper by Claber in stock. I got both. In hindsight, the dripper was redundant.

Once home, I built a ring-shaped irrigation system and tested it out.

I used the dripper first to build the irrigation system, which you can see in the picture above sitting on top of the bowl. During testing, I realised it was the wrong component. The water pressure from the pump was too high and water was bursting out of any opening it could find. I had to shut the system down immediately because there were electronics around me. In hindsight, I could have reduced the pump speed to a fraction of 100% and it might have worked.

Anyway, the next day I switched out the dripper for the coupling and switched to another plastic container, since we do eat from the bowl and it wouldn’t be very hygienic to use it for testing.

This time, it worked a lot better. Even at 100% pump speed, water didn’t burst out at the seams and instead dripped into the container gracefully.

And, this is where the integration with the main system comes in. The remaining part of the article will focus mostly on the software aspect of the integration.

Water Pump Integration

Previously, we got the water pump station working with an Arduino Uno and figured out how much water the pump could move in a given number of seconds. This gave us the confidence to start deploying it into the real world.

First, we connect the Grove cable to the I2C port on the Arduino shield before we can do anything else.

Connect the Arduino to the computer and load the original soil moisture monitoring Sketch into the Arduino IDE.

We will get the Sketch ready to support the motor driver.

Add the following #include directive.

#include "Grove_I2C_Motor_Driver.h"

And the following #define directive.

#define I2C_ADDRESS 0x0F

Then, we add the following to the setup function after Oled.begin().

 Motor.begin(I2C_ADDRESS);

The following is how the setup function should look.

void setup()
{
  Serial.begin(115200); //Open serial over USB

  pinMode(sensorVccPin, OUTPUT); //Enable D#7 with power
  digitalWrite(sensorVccPin, LOW); //Set D#7 pin to LOW to cut off power

  pinMode(sensor2VccPin, OUTPUT); //Enable D#8 with power
  digitalWrite(sensor2VccPin, LOW); //Set D#8 pin to LOW to cut off power

  pinMode(alarmVccPin, OUTPUT); //Enable D#5 with power
  noTone(alarmVccPin); //Set the buzzer voltage low and make no noise

  pinMode(button, INPUT);

  if (Oled.begin())
  {
    Oled.setFlipMode(true);
    Oled.setFont(u8x8_font_chroma48medium8_r);
  }
  else
  {
    Serial.print("Fail to initialise OLED");
  }
  Motor.begin(I2C_ADDRESS);
}

After the above, the Sketch is ready to trigger the pumps when the conditions are right. We will first add a few global variables and functions to handle starting the water pumping process and stopping the pump.

For global variables, we will need the following declared at the top of the Sketch together with the rest. Note: Motor and pump are used interchangeably in this project because the pump is just a special type of motor.

bool motorRunning = false; //Keep track if the pump is running.
unsigned long motorRunTime = 0; //Keep track of when the pump starts running.
unsigned long motorToRun = 1000 * 5; //5 seconds for a more meaningful test run

The following function determines if the pumps should activate.

For now, since we are just testing, we will go with a simple check to see if the moisture level is below a certain value. If it is, pump water. Given what we know about the soil moisture sensor, dipping it into a bowl of water gives us a reading of more than 800, so taking it out of the water will cause the following function to run.

void waterPlantIfNeeded()
{
    if (moistureLevels[1] < 800)
    {
        Serial.println("Pumping water now...");
        pumpWater();        
    }
}

The following function handles the actual water pumping process. The reason for running motor 2/pump 2 for testing is that it is the pump I have connected the irrigation system to.

void pumpWater()
{
    if (!motorRunning)
    {
        Serial.println("Running pump 2");
        motorRunTime = millis();
        Motor.speed(MOTOR2, 100);
        motorRunning = true;
    }
}

The following function stops the pump when it has run for more than the specified period.

void stopPumpingWater()
{
    if (motorRunning && millis() - motorRunTime >= motorToRun) //Subtracting keeps this correct even when millis() wraps around
    {
        Serial.println("Stop pumping water");
        Motor.stop(MOTOR2);
        motorRunning = false;
    }   
}

Then, in the loop function, we add the call to the waterPlantIfNeeded() function inside the if statement that does the sensor reading, as shown in the screenshot below.

After that, we add the call to stopPumpingWater() at the end of the loop function.

And now it is time for testing.

However, the testing didn’t go so well. The motor driver failed to initialise quite often, even after multiple restarts. And when it did finally run, the OLED display would fail to work. That was when I was reminded of the bug in the motor driver’s firmware that I mentioned previously.

It was time for a firmware update.

Updating Motor Driver Firmware

I came prepared for this step. A few days earlier, I had ordered and received a Pololu AVR Programmer v2.1, which can be used for updating the firmware.

So, I took apart the housing to gain access to the motor driver, hooked up the motor driver to the programmer and connected the programmer to the computer. To flash the new firmware, we can use avrdude.

On a Mac, you can download and install avrdude via Homebrew.

Also, do connect the motor driver to a separate power source, since the AVR programmer does not supply sufficient power. I made the mistake of not connecting the motor driver to a power supply and wasted half an hour trying to figure out why I couldn’t flash the new firmware.

You can download the latest firmware for the motor driver from the official repository on GitHub here.

You should also download and install the Pololu AVR Programmer v2 app onto your computer, because before we can flash the firmware onto the motor driver, we need to know which port we are using. The app can tell us that.

Once you have the firmware downloaded, avrdude installed and the AVR programmer running properly, open a terminal and navigate to the folder where the firmware resides.

Then, use the Arduino IDE to find out which ports on the AVR programmer are available for use.

In my case, the port I have to use for firmware flashing is /dev/cu.usbmodem003268172 as it is the programming port identified in the AVR Programmer app.

To flash the firmware, use the following command, supplying the programming port to use before pressing the ‘Return’ key.

avrdude -p m8 -c avrisp2 -P <the programming port to use> -U flash:w:./mega8motor_v4.hex

If the command is accepted, you should see something in the terminal that is similar to the screenshot below.

Now that the firmware is updated, we can continue our testing and get on with the rest of the implementation…

More troubleshooting…

It turns out that updating the firmware on the motor driver wasn’t enough, even though the Arduino was now able to detect the motor driver and trigger the pumps. After the motor driver was initialised, the Oled display refused to work. Even though the Oled display initialised correctly without any error, any commands sent to it didn’t yield anything.

Below is a screenshot of one of my attempts to investigate what was going on.

I thought initialising the Wire library manually would help. The Wire library is an Arduino library that allows us to work with I2C connections and devices.

There were times when it worked: I could use the Oled display after the motor driver was initialised. The first time it worked, I thought everything was fine.

Subsequently, the Oled display continued to fail and I spent half a day trying to understand why.

A chance Google search led me to this page on displaying data on an OLED, which is part of the Beginner Kit for Arduino.

At the end of that section was a Breakout guide. It describes what to do if the Oled display from the kit cuts out.

With that, I went searching for the library on my computer and made the change to U8x8lib.cpp.

Then, to test that the Oled works this time, I added a function called displayMessage(String message) to handle displaying text about the state of the Sketch. In the setup function, I added calls to this function with the relevant text.

After that, I recompiled the Sketch and uploaded it to the Arduino. Once the deployment completed, I kept a close watch on the Oled display. Soon, it became clear that it worked: the Oled display was showing me the text I set when calling the displayMessage function.

A further test of pressing the button I implemented previously to turn on the display revealed that it is indeed working properly now. I also knew that the motor driver was initialised properly, because I heard the pumps doing their test spins as implemented in the setup function.

With that, it would appear that a bug in the U8x8 display library gave us so much woe.

Going Live!

With the Oled display and motor driver working now, it was time to go live with the automatic water pumping. If you are wondering why the tubes are not in a water bottle/container, it’s because the picture was taken shortly after the successful test run. I didn’t want any water involved just yet.

What’s next?

Even though the water pumping station is integrated, it is not the end of this project yet. We still need to implement the WiFi connectivity and this will be covered in another article soon.

We also need a web-based dashboard and some other quality of life improvements such as buttons or switches to turn on/off specific pumps while the rest of the system is running or an API for us to manually trigger the water pumping via the internet.

If you would like to have a fuller picture of what I did, the source code for the Sketch can be found here.

An IoT project: Monitoring soil moisture (Phase 3) – Adding Watering Capability (Part 1)

In the previous articles, we covered the development of a soil moisture sensing system and the addition of a WiFi module so that the soil moisture information could be sent out. Phase 1 of the project is covered here while Phase 2 (Part 1) is covered here. If you are wondering why there’s no part 2 of Phase 2, it’s taking longer than I anticipated to implement whereas part 1 of Phase 3 is much easier to implement.

And as mentioned in our first article, we will be adding plant watering capabilities in Phase 3 of the project. We will call this the water pumping station.

For this phase of the project, we will need a few additional pieces of hardware to build the station.

Below is the list:

  1. 2x 12v peristaltic pump (also known as Dosing pump)
  2. Grove I2C Motor Driver
  3. Wires – Solid core 22AWG and Stranded (11 to 16 strands)
  4. 12v DC power supply
  5. Stripboard
  6. Soldering kit
  7. 12v barrel connector plugs (5.5mm x 2.1 mm female and male version)

Now that we have the hardware established, let’s get started.

Setting up the power module

This water pumping station will be connected to the existing soil moisture monitoring system. Given that the Arduino Uno accepts a 12v supply through its barrel connector, we could use the same power source for the water pumping station.

To achieve that, we could use a combination of stripboard, wires and barrel connector plugs. Sadly, there were no pictures of how I developed the power module other than the final picture of it already installed in the housing. I had to take apart the housing just to take a picture of it.

As you can see, two different types of barrel connectors are hooked up, along with a pair of wires that go to the motor driver board. The female barrel connector receives the 12v power from the power supply, while the male barrel connector plug connects to the Arduino.

On the other side of the stripboard, the wires are connected directly with no other fancy electrical components. In hindsight, it might have been better if I had added a few components such as switches and LEDs, but that will be for some future project.

Implementing the motor driver and adding motors/pumps

Arduino Uno alone can be used to turn on a motor with a combination of:

  1. Digital and analog pins
  2. Electrical components such as resistor, diodes and transistor
  3. External power supply
  4. Breadboard or stripboard

You can even find a tutorial for this setup here. It uses the Arduino to generate a signal that drives a transistor, running the motor at full speed or turning it off. It does not allow for reversing the motor’s spin or controlling its speed.

Do not power and drive motors directly from any of the Arduino pins. They do not provide sufficient power to drive motors, nor do they have the necessary protection against the counter-EMF from the motor.

Alternatively, we could go with a motor driver board that comes with all the necessary components installed, allowing us to control multiple motors with ease. It also lets us control the speed and direction of the motors.

And the second option is what we will go with.

The Grove I2C Motor Driver v1.3 by Seeed is a good choice as it supports driving two motors from its twin channels. The motor driver comes with the L298 chip, a dual full-bridge driver that supports high voltage and current and is designed to drive inductive loads such as relays, solenoids, DC motors and stepper motors. It also has an ATmega8L chip to enable the I2C connection, making it very compatible with Arduino.

We can connect the motor driver to an Arduino using the I2C connection via the Grove port and some jumper wires. But since I don’t have a spare Grove shield, I had to go with the alternative connection. The SCL and SDA pins on the Arduino Uno are A5 and A4 respectively. With that in mind, we can connect the yellow and white wires to those pins, then connect the 5v and ground pins.

Now we are ready to test the driver with an actual motor or pump.

But first, we will need to connect the wires to the terminals on the motor.

We could also solder the wires to the terminals to ensure a more secure and stable connection. You can see the result below after soldering, with the pump installed in the housing.

After connecting the pump to the motor driver, we could start testing whether the driver works and understand better how to use the hardware.

First, connect the Arduino to the computer. Then upload a BareMinimum sketch to the Arduino before connecting the motor driver. This removes any existing program on the Arduino and ensures there is nothing interfering with the driver.

Once the upload is completed, remove the black jumper on the driver. This step is necessary before we power up the driver with the 12v supply, as failure to do so means 12v going into the Arduino via the 5v pin. Since there is no electrical protection on that pin, we could fry the Arduino. Once the jumper is removed, we can connect the motor driver to the 12v supply. Alternatively, you could use a battery-to-barrel-jack adapter such as this one and hook up a battery with a voltage anywhere between 9v and 12v.

The image below shows the motor driver hooked up to the Arduino with the black jumper removed. The pump is also connected but it’s still in its bubble wrap.

Experimenting with and testing the pump

Based on the specifications, the pump can do 100ml/min. However, it’s always better to test it and see the pump’s performance for yourself. But before we can do any experimentation, we need to get the motor driver running.

The green LEDs should light up once the driver is powered and running.

But if you see the red LEDs light up alongside the green, it means that the driver has not initialised properly.

To fix it, simply press the white RESET button on the driver itself while it is connected to a powered Arduino. Everything should work afterwards.

During the various testing sessions, I noticed that the motor driver always has some initialisation problem after I deploy a Sketch to the Arduino or when it is freshly powered up. That means I need to press the RESET button at least once if I want the driver to work properly. This is a problem, but we will cover it in more detail later as it also affects the housing design. For now, we will continue with figuring out the water flow rate.

To do that, we will need a container filled with water, a measuring beaker and a small medicine cup. After connecting the silicone tubes to the pump, insert the suction tube into the water container.

But how do we know which is the suction end?

For the Gikfun pump that I’m using, the suction end is on the right side when the two tube connectors are facing you. Alternatively, you can determine the suction end by running the pump straight from a 12v supply and checking which tube takes in the water. Always ensure the other tube is placed in the measuring beaker. We don’t want to spill water all over the place, especially not when electronics and electricity are involved.

Once we know which end of the pump takes in water and which end outputs it, we can start testing how much water is being output.

We can upload a simple Sketch that spins the motor in the pump for a certain amount of time.

#include "Grove_I2C_Motor_Driver.h"

#define I2C_ADDRESS 0x0f

void setup()
{
  Serial.begin(9600);
  Serial.println("Starting motor");

  Motor.begin(I2C_ADDRESS);
  Serial.println("Motor Initialised.");
  delay(2000);
}

void loop() {
 
  Serial.println("Running pump 1");
  Motor.speed(MOTOR1, 100);
  delay(15000); //15 seconds
  Motor.stop(MOTOR1);
  Serial.println("Motor stopped.");
  
  delay(1000UL * 60UL * 5UL);
}

For my first experiment, I ran the pump at 100% for 15 seconds. The reasoning behind the 15 seconds was a gut feeling that anything less would not be sufficient to water the plant, so shorter durations seemed pointless to test. The 5-minute delay is there to give us time to measure how much water was pumped into the beaker.

Since the smallest marking on the beaker was 100ml, there wasn’t sufficient water in the beaker to measure. This is where the medicine cup comes in, since its smallest marking is 1ml and its largest is 15ml.

I poured the water from the beaker into the medicine cup until it reached the 15 ml mark and checked the beaker. There was no water left in the beaker.

I then increased the run time to 25 seconds and repeated the measuring process. It turns out the pump put out about 25ml in 25 seconds. This is completely different from what is specified.

If the pump could do 100ml/min, that means it could move about 1.67ml per second. At 25 seconds, I should have seen around 42ml of water, not 25ml. This just proves that taking actual measurements is always necessary.

With this discovery, I can determine how long to run the pump based on the size of the pot, the type of soil and how dry the soil is. That shall be the topic for another article.

Housing the pump station and getting ready for integration

For continuity in design, I will be using Lego bricks again. Previously, I got a Lego bricks-and-plates set, which comes with 1,500 pieces and some base plates, with the intention of doing this project. I also had leftover Lego bricks from my other IoT project, which I will be using first.

After choosing the right base plate, I installed the power module first. The bricks were placed with the idea that they should hide the power module, since it’s relatively ugly. Once that was done, the motor driver went in and more bricks were added.

Below is the initial design that I had before I realised it couldn’t support two pumps.

After some more hours of work, the first pump was installed and the walls went up. The motor driver was moved to the centre of the base plate.

This is the side view. As you can see, cable ties are used since the pump has dimensions that are incompatible with the Lego bricks.

After adding the second pump, the pumping station was mostly complete.

It was only after the housing was done that I realised I needed a quick way to access the RESET button, as I will be integrating the station with the soil moisture monitoring system and will need to deploy new Sketches every now and then. I also need to take into account that whenever there is a lightning storm, the circuit breaker in my house trips. In that scenario, the motor driver will probably need a reset.

The good news is there are ISP headers on the board that we can use to perform the reset.

Now, the RST is pulled high by default. To perform the reset, we need to connect it to ground, thereby pulling it low. And we can achieve that with a push button, connecting the RST pin to ground.

So, it was time to take out the motor driver from the housing and solder on the pin headers.

Then, we can prepare a push button and solder the wires on.

Once everything is hooked up, we shall have a quick access reset button.

Then, we power up the motor driver and test the reset button to make sure it works. As usual, the red LEDs on the driver light up once it is powered on, because the driver fails to establish the I2C connection. At this point, I connected a powered Arduino to the driver and pressed the reset button. After a while, the red LEDs went off, which means the I2C connection was established.

It was discovered later that there is a bug in the motor driver’s firmware. It crashes when something that is not hardcoded in the firmware is sent to the driver, causing the I2C connection to fail. The good news is that new firmware has since been made available, and we can update it by following the instructions here.

Now that I had proven the push-button reset works, the housing was modified to incorporate the button. I had to use several layers of paper to get the button to sit tightly in the window brick.

With that, we can start integrating with the soil moisture monitoring system. The two Lego housings are first combined, but we won’t be connecting the motor driver or powering it up, as we still need to update the sketch currently running in production.

Here are some pictures of the upgraded system.

This marks a good point to stop this article. In the article covering Phase 3 Part 2, we will look at how we set up the tubes around the plants and update the existing sketch. And let’s not forget that the article for Phase 2 Part 2 is also in the works. In that article, we will look at how to upload telemetry data from the Arduino to a dashboard.

An IoT project: Monitoring soil moisture (Phase 1)

I have several pots of Dracaena Sanderiana (Lucky Bamboo) at home and I struggle with watering them.

The genus loves moderately moist soil. Overwatering will lead to root rot and the plants turning yellow, while under-watering leads to dry soil and can kill the plant too.

This is why I got the idea to build an IoT solution that will tell me when the soil is too dry or too moist. This way, I would know when to water and have a rough gauge of how much to water.

Hardware and the plan

This project is split into three different phases.

For the first phase, it’s about being able to read the soil moisture information and present it. It should also notify me about low soil moisture.

To get started, we will be using the following hardware:

  1. Arduino Uno
  2. Sparkfun Soil Moisture Sensor (SEN-13322)

Rather than getting other types of sensors individually, I went with the Seeed Arduino Sensor Kit. It comes with a collection of common sensors and components such as a button, an accelerometer, and humidity and temperature sensors. It also comes with a Grove Base Shield that allows me to connect the various sensors easily via the Grove connectors.

From this kit, I decided to use the buzzer as the notification device. Whenever the moisture level drops below a certain value, it should buzz to let me know. To present the moisture information, I went with the small 128 x 64 OLED display.

As I don’t want to burn out the display by keeping it turned on 24×7, I also hooked up the button. When pressed, the button generates a signal that turns on the display for a few seconds before it switches off again.

For power, I got a power supply with a 5v output and barrel connector instead of powering it via battery.

For the second phase, I will look into implementing WiFi communication for the Arduino so that it can send soil moisture data back to a database. This way, I can build a dashboard to see the history of the soil moisture.

For the third and last phase, I will incorporate an automatic water pump system that will water the plant whenever the soil moisture dips below a certain level. This way, I can free up more time to do more projects.

Implementation (Writing the code)

With the documentation and code provided by Sparkfun for the soil moisture sensor, I managed to get the sensor to start reporting its first set of values.

The sensor is powered by 5v from Arduino digital pin 7 (D7). The data pin, also known as the signal pin, is hooked up to A0 of the Arduino analog pins.

To read a value from the sensor, it must first be powered up. This is done by setting digital pin 7 as an OUTPUT on the Arduino in the setup function.

Then, the moisture value is read from A0 using analogRead in a separate function, after which D7 is turned off.

To get a better picture of the values the sensor returns, Serial.println() is used to send data back to the Serial Monitor of the Arduino IDE over UART. When the sensor is not touching anything, the values returned were between 0 and 10. When I touched the sensor’s prongs with my moist hands, the moisture reading went anywhere between 800 and 895. I even placed the sensor into a bowl of water and got the same range of 800 to 895. A dry hand returned values between 300 and 500.

With those values, I worked on implementing the buzzer, which is connected to D5 on the Grove Base Shield. Based on my understanding of the plant, if the soil moisture is between 300 and 600, it’s probably too dry.

In that case, the program will call the buzzUrgent function and the buzzer will go off twice.

Initially, I had it go off three times with a 1.5-second delay using a while loop. However, the Arduino community advises that delays should be used sparingly, since delay() is a blocking call and will prevent the Arduino from doing other things. Thus, I changed it to use shorter delays to create a rapid beeping sound that should catch my attention.

The program also records the time (in milliseconds) when the buzzer went off. If the soil remains dry, the buzzer will continue to go off twice, once every half an hour (in milliseconds), as defined by the following variables.

On the other hand, if the soil is too wet, the buzzer will also go off. Initially, I went with the value of 850 because that was the highest I got when the sensor was in a bowl of water. Several days later, the buzzer kept going off while the sensor was in the soil, after the Arduino restarted when I powered it off to switch power sockets. When I checked the values read by the sensor, they were hovering between 850 and 890. With that, I had to change the threshold from 850 to 900.

During the development process, the Arduino, sensor and the OLED display are kept in a plastic container. Once the main development is done, we will look into building a housing.

The next thing I implemented was the button. I needed a way to turn the display on only when I need it and turn it off after a certain amount of time. This was to protect the display and prevent it from burning out.

I connected the button to the D4 Grove socket on the Grove Base Shield. Then, the code to detect whether the button is pressed was added to the loop function.

isButtonPressed is a simple function that checks whether the button state is HIGH. If so, it returns true; otherwise, false.

With the button implementation, the program will cause the OLED to activate for 5 seconds every time I press it. This should give me enough time to read the value.

One last thing to mention is that the loop function previously had a blocking delay at the bottom. I removed that in favour of a non-blocking timer, which takes a soil moisture reading once every 30 seconds. For the initial prototype, this is probably a good interval, since I don’t want to wait ages for a new value. Once the system is stable, I will increase the interval between sensor reads further. Once every thirty minutes is not out of the question, since soil moisture levels don’t change that fast (unless we are in a heatwave and drought).

The code to the project can be found here.

Problems and Troubleshooting

During the implementation, I came across an issue that got me wondering if there was an issue with the hardware.

When I first connected the OLED display, I couldn’t get it to work. After nearly a day of research and trial and error, it turned out to be because I’m using an Arduino Uno. Based on the README provided here: https://github.com/Seeed-Studio/Seeed_Learning_Space/tree/master/Grove%20-%20OLED%20Display%200.96”(SSD1315)V1.0, software I2C should be used instead.

However, since I’m using the Arduino_Sensorkit library, there’s no direct way of using software I2C. More research pointed me to this commit history in the Arduino_SensorKit repository on GitHub.

With that, I located version 1.0.5, which had support for software I2C, and installed it. I retested the program and the OLED display turned on. I could then display information on it.

Housing and deployment

Like the previous IoT project that I did, I went with using Lego for the housing.

Fun fact: the button was a last-minute addition after the housing was done, when I realised I needed a way to turn the display back on after it has gone to sleep. That’s why it’s dangling outside. A redesign of the sensor housing on the right is necessary.

As you can see below, the buzzer, OLED display and button are connected to the Grove Base Shield via the Grove connectors while the moisture sensor is connected using jumper cables.

This is the finished product. The Arduino is hidden away within the box. The top part is actually a movable flap that gives me quick access to the hardware. The OLED display is placed on the side, stuck to two transparent panel bricks with double-sided tape. A roof is added to “protect” the display from water splashing on it from above. On top of the display is the button that I can press to turn the display on for a quick glance at the moisture value.

At the back of the housing, there’s a compartment for the buzzer. Lego bricks that look like a window create an opening for the sound to exit, so it doesn’t sound muffled.

Once the housing was done, the project was finally deployed. But first, the sensitive part of the moisture sensor had to be protected from water. So I modified an anti-static bag, wrapped it around the top part of the sensor and secured it with a decent amount of sticky tape.

Here is the OLED display in action after the button is pressed. It shows the current soil moisture level. Based on the value, it looks like I don’t need to water the plant just yet.

An IoT project: Location check-in and temperature recording using facial recognition

With the COVID-19 pandemic, temperature checking before you enter a premises is a necessary step to ensure you are well and to help keep others safe. In Singapore, it is a government requirement.

One of the most common ways is to use a handheld digital thermometer to take your own temperature and record it in a logbook placed near the front door. In a workplace, employees are encouraged to record their temperature twice a day.

This has two potential issues.

The first issue is the number of surfaces a person comes into contact with: the thermometer, the logbook and the stationery used for recording.

The second issue is the logbook itself. It is a collection of printed pages with tables for people to fill in their entries. It can get very unwieldy, as people need to flip through the pages to locate an empty slot for their first entry of the day, or to find their earlier entry when recording their temperature for the second time.

How are we going to solve this?

The general idea goes like this:

  1. Detect a face
  2. Check that the person is in the database
  3. If the person is in the system, record their temperature and create an entry with the current timestamp in the system.
  4. Otherwise, notify the admin of the unknown person via email with an attached form to register the person
  5. Once the person is registered, they will be checked-in.
  6. The data will be stored locally for future retrieval.

With the general idea in mind, the next thing to decide on is the hardware that we will be using to implement this solution.

There are many great options these days as computing technology has come so far: tablet such as an iPad, a Raspberry Pi, a Nvidia Jetson or even a decent laptop.

An iPad is an expensive solution. The cheapest model costs US$329. Furthermore, there’s only a limited set of hardware we can use with the iPad to capture and record temperatures. One such accessory is the FLIR ONE for iOS Personal Thermal Imager. This accessory is expensive, costing US$260, and is not necessarily available for purchase in Singapore without some kind of import restriction. However, it would probably require the least amount of boilerplate work, since Apple has created some of the best APIs for facial recognition and machine learning.

The Nvidia Jetson is another possible option. It costs about the same as a Raspberry Pi and comes with good software support. The hardware includes a 128-core GPU that can easily churn through images and video feeds without any issue. There’s also strong community support, which makes it easier to find information and troubleshoot issues.

The Raspberry Pi ranks about the same as the Nvidia Jetson in terms of price and purpose. However, there are a few aspects in which the Raspberry Pi, especially version 4, edges out the Jetson. The first is its faster, newer and more efficient CPU. The second is power consumption: the Raspberry Pi 4 consumes at most 3.4 watts, compared to 10 watts for the Nvidia Jetson. One could attribute that to the less powerful but efficient Broadcom VideoCore on the Raspberry Pi. The last is size: the Nvidia Jetson is rather big because it needs to support a large heatsink, and so requires a bigger case to house it.

The Hardware and Software

The Raspberry Pi 4 won out in the end because we don’t need the heavy AI or robotics workloads the Nvidia Jetson is intended for. Furthermore, I can always reuse the Raspberry Pi for future projects that are more general-purpose. Lastly, it is also cheaper than an iPad with the FLIR camera accessory, even after taking into account the camera, sensor and housing we have to buy.

Since I got the Raspberry Pi 4 as a hobbyist kit from Labists, it came with Raspberry Pi OS. A quick setup process was all it needed to get started.

For the camera, I went with the Raspberry Pi NoIR Camera Module to capture the video feed and do facial recognition. My initial reason was that I had assumed it could pick up temperature differences. I was proven wrong: further research showed that the lack of an IR filter on the camera is there to let it see better in low-light conditions. Still, I saw an opportunity here: I could deploy this solution for use at night or in areas with poor lighting.

Now that I needed a way to pick up temperatures more accurately, several hours of research pointed me to the AMG8833 thermal sensor. I found one from SparkFun that was readily available locally. There are other more powerful and probably more accurate thermal sensors and cameras, such as the SparkFun IR Array Breakout – MLX90640, but they cost more and some were out of stock.
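For context, the AMG8833 reports temperatures as an 8×8 pixel grid (for example, via the `pixels` property of Adafruit’s `adafruit_amg88xx` library). Below is a hardware-free sketch of how such a frame might be reduced to a single reading; taking the hottest pixel plus a distance offset is my simplification, not necessarily what the reference code does.

```python
def frame_to_reading(pixels, offset=0.0):
    """Reduce an 8x8 AMG8833 frame (8 rows of 8 temperatures, in
    degrees Celsius) to a single value: the hottest pixel is taken as
    the skin temperature, plus a distance-compensation offset."""
    if len(pixels) != 8 or any(len(row) != 8 for row in pixels):
        raise ValueError("expected an 8x8 grid")
    return max(max(row) for row in pixels) + offset


# Demo frame: mostly room temperature with a warm face-shaped blob.
frame = [[22.0] * 8 for _ in range(8)]
frame[3][3] = frame[3][4] = 33.5
frame[4][3] = 34.0
print(frame_to_reading(frame, offset=2.5))  # 36.5
```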

Now that we’ve got the hardware figured out, we need to determine what kind of framework or software we can use for the facial recognition part.

I decided upon OpenCV, as it was something I’m familiar with and it comes with good support for the Raspberry Pi. A quick Google search will give you a number of results.

The latest version of Python 3 (v3.9 at the time of writing) was used.

The following libraries were also installed to reduce the need to write boilerplate code just to get the facial recognition functionality working:

Implementation of the facial recognition, thermal reading and check-in

You can see the architecture diagram for the solution below.

From the architecture, we can see that there are two facial recognition-related modules running on the Raspberry Pi. I implemented them based on the solutions provided by Adrian Rosebrock at his website.

For face detection, the Haar Cascade classifier is used, together with the k-NN classification algorithm for identification.

The Face/Name Detector is responsible for building up the face/name store, which is then used for check-in via face identification. It scans a specific folder containing sub-folders, each named after a person and holding his or her pictures. Below is an example of the folder structure.
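That folder layout can be scanned with a few lines of Python. This is an illustrative sketch only; the project’s actual Face/Name Detector also computes a face encoding for each picture.

```python
import tempfile
from pathlib import Path


def scan_dataset(root):
    """Map each person (sub-folder name) to the image files inside it,
    mirroring the dataset layout described above."""
    dataset = {}
    for person_dir in sorted(p for p in Path(root).iterdir() if p.is_dir()):
        images = sorted(
            f for f in person_dir.iterdir()
            if f.suffix.lower() in {".jpg", ".jpeg", ".png"}
        )
        if images:  # skip folders with no pictures
            dataset[person_dir.name] = images
    return dataset


# Demo on a throwaway directory tree.
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "alice").mkdir()
    (Path(tmp) / "alice" / "01.jpg").touch()
    (Path(tmp) / "bob").mkdir()
    (Path(tmp) / "bob" / "01.png").touch()
    print(sorted(scan_dataset(tmp)))  # ['alice', 'bob']
```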

The Face Recognition with Check-in module does what its name suggests. It pulls the feed from the camera and checks each frame for a face. Once it finds one, it checks the face against the face/name store. Only then does it read the temperature detected by the thermal sensor and record the person in the temperature store, which is nothing more than a .csv file.
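Since the temperature store is just a .csv file, appending a check-in entry needs only the standard library. A sketch follows; the column names are my own guess, not the project’s actual schema.

```python
import csv
import tempfile
from datetime import datetime
from pathlib import Path


def record_check_in(store_path, name, temperature, when=None):
    """Append one check-in row to the CSV temperature store, writing a
    header row first if the file does not exist yet."""
    path = Path(store_path)
    new_file = not path.exists()
    when = when or datetime.now()
    with path.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["name", "temperature_c", "timestamp"])
        writer.writerow(
            [name, f"{temperature:.1f}", when.isoformat(timespec="seconds")]
        )


# Demo: two check-ins produce a header plus two rows.
store = Path(tempfile.mkdtemp()) / "temperature_store.csv"
record_check_in(store, "Alice", 36.5)
record_check_in(store, "Bob", 36.812)
print(len(store.read_text().splitlines()))  # 3
```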

When it comes to the thermal sensor, I relied on information gleaned from this site. The GitHub repo provided the source code necessary to work with the sensor and formed the basis of my implementation.

Once the system detects a known face and has recorded their temperature, it stores the entry in the temperature store. Below you can find the flowchart that describes the execution.

And for those who like to read source code, the project’s code can also be found here.

Building the housing

I had a rough plan for how to house the hardware. It goes like this:

  1. There needs to be a compartment to house the Raspberry Pi
  2. The camera should be placed high enough
  3. The thermal sensor should have a clear line of sight
  4. It should hold a display

There is always the option of using a 3D printer but I knew it would probably take me way too long to figure out how to put everything together. So I went with the one thing I know: LEGO. It gives me the flexibility to change my design whenever the situation demands it while also giving me the power to make whatever I want. All I needed was the right kind of bricks and the right amount.

Below is the first prototype. Its sole purpose was to house the camera and Raspberry Pi.

The above design couldn’t scale, but it gave me a foundation to go further. I changed the Raspberry Pi housing and added more compartments to house the 7-inch touchscreen display and the thermal sensor.

It was during this time that I realised I lacked the right type of jumper cables to wire the display to the Raspberry Pi. I brought out a breadboard I had purchased earlier to use as a connection hub, and built a separate compartment to house it so it could connect all the components together via wires. This is how the following design came about.

Testing

Since this is meant as a prototype and a learning exercise, I didn’t get around to unit testing and all that stuff. My focus was solely on getting things to work, so I only did some application-level tests to make sure whatever I’d implemented was working.

Since I was developing primarily on my MacBook Pro, I tested the face identification implementation there, with the webcam as the source of the video feed, before deploying it onto the Raspberry Pi.

After the thermal sensor was installed, I got to see its effectiveness for myself. The reference repo contains two example Python modules (IR_cam_interp.py and IR_cam_test.py) that I could use to check that the thermal sensor was working.

Using the modules, the sensor picked up my skin temperature as around 34 degrees Celsius at about 10 cm. Going slightly further away, at 15 cm, the detected temperature dropped to about 33 degrees Celsius. Any further, and it becomes hard to get an accurate reading without adding some kind of offset value. Thus far, I have tried 2.5, 6.0 and 8.0; the last gave the best accuracy. But this also means that placing the thermal sensor on top of the camera isn’t really a good implementation.

What’s next?

Since the thermal sensor doesn’t give very precise readings for faraway objects such as a human body, another approach is required. Below you can see a drawing of the improvement that I could make.

Other than that, the solution also lacks the ability to register a new or unknown person into the system via some type of web interface or from another computer. For now, I manually upload the pictures and run the Face/Name Detector to build the database.

Last but not least, I could also consider adding backend support via Microsoft Azure to collect the recorded temperatures, alert the admin to unknown persons, and enable downloading of the stored records.