WIP [ITP Blog]

Intro to Physical Computing - Week 15 [ChatBotany Final]

December 12, 2018

For my final physical computing project, I built ChatBotany, an online chatbot that allows you to interact and converse with a plant. The project imagined what a conversation with a plant could be like and how its personality could be affected by environmental inputs like soil moisture levels. ChatBotany also gave me an opportunity to work with a Raspberry Pi, which runs the Node.js web server that operates the Facebook Messenger chatbot. I hope to continue iterating on this project and open-sourcing its code so anyone can set up a chatbot with their own plant. This could eventually allow plants to have a “chat” conversation among themselves, forming a self-reliant care system that requires no human input.

Past projects that served as both references and inspiration include:

Idea Testing - Tamabotany

The first version of ChatBotany was called Tamabotany: a simple LED and serial-communication prototype for a plant-based chatbot. Our class held a play-testing session with another physical computing class to collect feedback on our project ideas. The LEDs represented the different inputs/actions that could be activated through a chat interface: blue for water, yellow for sunlight. A piezo speaker activated when the “virtual hug” button was clicked, demonstrating how the digital act of pushing a web UI button could produce an action in the physical world.


led placeholders

test setup

web screenshot

user testing


Feedback collected from this play-testing session eventually transformed Tamabotany into ChatBotany. Users felt emotional connections to the plant open up once they could converse with it. Adi also proposed looking into Facebook Messenger’s API so that I could use a familiar and full-featured chat platform. Ashley made the critical point that there would need to be some calibration if this project was shared with others, because each plant has different kinds of needs (e.g. a succulent needs less water than a fern).

But it was Danny’s comment to explore the topology of a chatbot plant network that made me think about this project’s potential future. He asked what would happen if the project expanded beyond a one-to-one, human-to-plant relationship. What happens when a plant can chat with another plant? Can a group of plants take care of each other in a more personal fashion — in a way that moves beyond an automatic sprinkling system and has one plant actively choosing to water its peer? While I wouldn’t have time to realize these possibilities in the winter show version of ChatBotany, they’re still questions and thoughts I’d like to pursue as I continue to iterate on this project.

Early Component Testing

I then purchased and tested the components I would need to make an online, plant-based chatbot that could read its soil moisture level, trigger a watering action from a digital signal, and play back sound clips. I started by recreating the transistor circuit from my midterm project so that a digital signal from a microcontroller could activate a DC motor water pump.

transistor circuit

I then built test circuits for the 12V DC water pump. My initial plan was to use my microgreen bin as the plant connected to the chatbot, so I also tested whether puncturing holes in a rubber tube could achieve a sprinkler effect.

water pump


I then found these soil moisture sensors, which can indicate how much water a plant is getting. These sensors output both analog and digital data, so I tested reading both using an Arduino Uno.

moisture sensor

The final component I tested was a Raspberry Pi device. The ITP shop has Raspberry Pi Zero W kits available for rent, so I spent a day working through Tom’s tutorial on setting up an internet-connected Pi. I will follow up this documentation post with a more detailed blog post on configuring a Pi to run a publicly accessible web server, using a Node.js Express server on the Pi and Nginx to route the web traffic.
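A reverse-proxy block like the one described might look roughly like this (the domain, port, and Express setup here are assumptions for illustration, not the actual configuration):

```nginx
server {
    listen 80;
    # placeholder domain; the real host name would go here
    server_name chatbotany.example.com;

    location / {
        # hand all incoming traffic to the Express server running on the Pi
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

Note that Facebook Messenger webhooks require HTTPS, so in practice Nginx would also need to terminate TLS.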

pi zero

I attempted to follow this tutorial on reconfiguring the Zero’s pins to support audio out, but couldn’t get it to work. I wasn’t allowed to solder any pins to the shop’s Pi, so I suspect the audio failed because of loose connections to the GPIO pins. Luckily, I had ordered a Raspberry Pi 3 B+ for this project, which has a built-in audio line-out port.

pi zero sound

Prototyping With A Raspberry Pi

Once I had a Pi 3, I started to convert the Arduino scripts into Python and then later Node RPIO scripts (so they could be executed from within a Node.js based web server running on the Pi).
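As a sketch of what that conversion looked like, the Arduino-style pump pulse translates to something like this in Python (the `set_pin` callback here is a hypothetical stand-in for a real GPIO write such as RPi.GPIO.output, kept injectable so the logic can run off the Pi):

```python
import time

def water_plant(set_pin, seconds):
    """Pulse the pump's transistor control pin HIGH for `seconds`.

    `set_pin` is a stand-in for a real GPIO write (e.g. RPi.GPIO.output
    with the pin number bound in); injecting it keeps the timing logic
    testable away from the Pi.
    """
    set_pin(True)   # transistor conducts; pump powers on
    time.sleep(seconds)
    set_pin(False)  # cut the signal; pump powers off
```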

pi 3


When testing audio playback, I used a speaker from the junk shelf and this amplifier circuit to increase its volume.

audio line


During an office hours session with Danny, I realized the Raspberry Pi doesn’t natively support analog input. So I first tried implementing an RC circuit, which produces an analog value by measuring how the variable resistance of an analog sensor affects the time it takes to charge and discharge a capacitor. This approach worked for a light sensor, but I ran into issues with the soil sensor because its default state is a high voltage output reading.
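The timing math behind that RC trick can be sketched as follows (the component values and the 1.65 V threshold are illustrative assumptions; actual GPIO HIGH thresholds vary):

```python
import math

def estimate_resistance(charge_time_s, capacitance_f=1e-6,
                        vcc=3.3, v_threshold=1.65):
    """Back out the sensor's resistance from a measured RC charge time.

    A capacitor charging toward vcc through resistance R crosses the
    GPIO HIGH threshold at t = R * C * ln(vcc / (vcc - v_threshold)),
    so R = t / (C * ln(vcc / (vcc - v_threshold))).
    """
    return charge_time_s / (capacitance_f * math.log(vcc / (vcc - v_threshold)))
```

In practice the Pi script discharges the capacitor through a pin set to output LOW, switches the pin to input, and counts how long it takes to read HIGH.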

cobbler sun

I then tried the MCP3008 analog-to-digital converter that I purchased from Tinkersphere, which communicates over SPI to convert analog voltages into values that the Pi’s digital pins can read.
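For reference, the three-byte SPI exchange the MCP3008 expects can be sketched like this (the transfer callback is an assumption standing in for a real SPI library call such as spidev’s `SpiDev.xfer2`):

```python
def mcp3008_read(spi_xfer, channel):
    """Read one of the MCP3008's eight channels as a 10-bit value (0-1023).

    `spi_xfer` stands in for a real SPI transfer function: it sends three
    bytes and returns the three bytes clocked back from the chip.
    """
    if not 0 <= channel <= 7:
        raise ValueError("MCP3008 has channels 0-7")
    # byte 0: start bit; byte 1: single-ended mode + channel in the top nibble
    reply = spi_xfer([1, (8 + channel) << 4, 0])
    # the 10-bit result spans the low 2 bits of byte 1 and all of byte 2
    return ((reply[1] & 0x03) << 8) | reply[2]
```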

moisture sensor analog

I couldn’t get the ADC chip to consistently return usable analog values, so in the end I opted to use the soil moisture sensor’s digital output. The sensor includes a potentiometer through which a moisture threshold can be set.

I also tested programmatically activating a webcam on the Raspberry Pi because someone mentioned during user testing that it would be fun if our plants could share selfies. The first webcams I tried were normal USB computer webcams, but I was unimpressed with their quality and found working with the fswebcam JavaScript API confusing. So I purchased a Raspberry Pi camera that connects to the device over CSI and has a more robust API.


Below is a diagram of the final circuit used in this project. Because I wanted to keep the circuit size compact, I decided to drop the amplifier circuit and use an externally powered speaker instead. ChatBotany’s flow works as follows:

  • User sends a chat message asking the plant how it’s doing

  • The digital value of the soil moisture sensor is read

    • If the sensor returned dry, a message reply is sent to the user with buttons to water the plant
    • If the sensor returned wet, the plant responds that it’s doing fine.
  • When the user clicks one of the water buttons, a digital HIGH signal is sent to the transistor, which allows power to flow through the DC water pump. The button’s water value determines how long the pump stays active; more water means the HIGH signal is sent for a longer time.

  • User sends a selfie image message to the chatbot server

  • Chatbot parses the message as an image type and captures a plant selfie using the Raspberry Pi webcam

  • Chatbot sends webcam image file to Facebook in message reply

  • User sends an audio recording message to the chatbot server

  • Chatbot parses the message as an audio type and downloads the audio clip from the message’s audio clip URL

  • Chatbot plays the downloaded audio file for the plant, then sends a message response acknowledging it listened to the audio clip
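The flow above can be sketched as a single dispatch function (the message format and the callback names are hypothetical stand-ins, not ChatBotany’s actual code, which runs in Node.js):

```python
def handle_message(message, soil_is_dry, run_pump, take_selfie, play_audio):
    """Route a parsed chat message to the matching plant action.

    The message shape and the four callbacks are illustrative stand-ins
    for the real Messenger webhook payload and the GPIO/webcam routines.
    """
    kind = message.get("type")
    if kind == "image":
        return {"type": "image", "file": take_selfie()}
    if kind == "audio":
        play_audio(message["url"])
        return {"type": "text", "text": "Thanks, I enjoyed listening to that!"}
    if kind == "postback" and message.get("action") == "water":
        # a bigger water amount holds the pump's HIGH signal for longer
        run_pump(seconds=2 * message.get("amount", 1))
        return {"type": "text", "text": "Ahh, much better."}
    # default: a "how are you doing" style text message
    if soil_is_dry():
        return {"type": "buttons", "text": "I'm thirsty!",
                "options": ["water a little", "water a lot"]}
    return {"type": "text", "text": "I'm doing fine."}
```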



I then built a cardboard prototype enclosure for the next play-testing session. A lot of the feedback revolved around fleshing out the plant’s language/personality in the chat conversation and building a rustic, approachable enclosure that could match and be a physical extension of a plant’s personality.



user testing


The next few photos document how I fabricated the enclosure and plant presentation for the winter show. I used a discarded wine crate for the box, tested different lids and covers until Ben suggested keeping it simple and housing the components inside, designed and laser-etched a logo, dealt with water spills, and stained the box for my first attempt at finishing a fabrication project.

box sketch

wine box


acrylic lid

lid proposals


lathe close up



laser cutting cardboard panel



all set

wood stain


logo sketches


etching logo

painting logo


Here are some photos of the final ChatBotany project that I presented at the ITP Winter Show 2018.



webcam speaker

soil sensor


And some videos demonstrating the plant communicating with a human through Facebook Messenger.

Future Thoughts

Some thoughts I had as I continue to iterate on ChatBotany, particularly after all the feedback I received from winter show guests:

  • The chatbot needs to be able to respond to different phrases that ask the same thing (e.g. “are you dry” vs. “are you thirsty”)
  • Allowing users to name their plant and configure its needs (e.g. “thirst” levels), effectively creating plant personas and opening up the possibility of plant-to-plant communication
  • Working with an analog-circuit or Raspberry Pi expert to get proper analog values from the soil moisture sensor
  • Incorporate additional sensors such as a temperature or humidity sensor that influence the plant’s tone (e.g. sending delirious messages when it’s too hot) and allow for additional actions a user can take through chat (e.g. turning on or off a light)
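The first bullet, for instance, could start with simple keyword matching before reaching for anything fancier (a minimal sketch; the phrase lists are illustrative, and a real version might lean on Messenger’s built-in NLP instead):

```python
# Illustrative keyword lists mapping different phrasings to one intent.
INTENTS = {
    "thirst_check": ("dry", "thirsty", "water"),
    "mood_check": ("doing", "feeling", "how are you"),
}

def match_intent(text):
    """Return the first intent whose keywords appear in the message."""
    lowered = text.lower()
    for intent, keywords in INTENTS.items():
        if any(keyword in lowered for keyword in keywords):
            return intent
    return None
```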

I will follow up this documentation post with a more detailed post on the ChatBot’s code and setting up the Pi to run a web server.

Adrian Bautista

A perpetual work-in-progress blog documenting my NYU ITP projects. Words are my own.