Sensor Data: Storage & Query

As a participating “GROW Place” in this EU Soil Sensing Mission, my farm is generating data from a network of soil sensors we deployed around the farm last year. These sensors log a reading of 4 different indicators (soil moisture, temperature, light exposure and fertility) once every 15 minutes, so that’s nearly 100 lines of time series data every 24 hours. With some 150 sensors logging data at that rate for nearly a year already… That’s a few million records at this point, which is rather a lot to manage in spreadsheets, as you can imagine!

So: I know that farmOS has some capability for storing sensor data, which sounds great if it would enable me to easily correlate that data with production data along a shared timeline, but I wonder whether the system can handle such a volume of data (my instance is hosted at Farmier, NB). In the first place: is bulk upload of records on that scale possible? And if so, what sort of query capability does the system provide? Alternatively: could I perhaps access the database with my choice of SQL query tools (given read-only access, of course, to a view that might constrain my ability to run very processor-intensive queries), and then export my custom-generated reports as CSV files that I could bring into RStudio for further analysis?

Generally speaking: I’m happy to see all the different structured data types that farmOS is able to digest, and the affordances (including mobile) for entering such data one record at a time. But if there are any upper limits on the volume of data stored, and/or on the time to process queries across a large number of records, I need to know about such constraints now.


Great questions @walt!

The short answer is: sensor data is stored in an SQL database (by default), so you are only limited by what an SQL database can store/manage, which is a lot. The amount of data you described shouldn’t be a problem at all.

You can find more information about using the farmOS Sensor module and API here: https://farmos.org/guide/assets/sensors/

Now for a more nuanced answer: farmOS takes the same modular approach with sensors as it does with everything else. It comes with two modules: Farm Sensor (which provides the basic “Sensor” asset type) and Farm Sensor: Listener (which defines a “Listener” sensor type). Sensor assets can have a “type”, which extends how they work and what they can do. The Listener type provides a very simple API endpoint for each sensor asset that accepts data points posted to it via HTTP requests (see docs linked above). It stores these in a database table that the Listener module provides: https://github.com/farmOS/farmOS/blob/7.x-1.x/modules/farm/farm_sensor/farm_sensor_listener/farm_sensor_listener.install#L12
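To make that concrete, here is a minimal Python sketch of posting a single data point to a Listener sensor. The endpoint path, key handling, and payload shape are my reading of the docs linked above, and the hostname and keys are hypothetical placeholders, so treat this as a starting point rather than a recipe:

```python
# Minimal sketch of posting one data point to a Listener sensor.
# Assumptions (check the docs linked above for the exact format):
#   - the endpoint path is /farm/sensor/listener/<public_key>
#   - the private key is passed as a query parameter, if required
#   - the payload is JSON with a Unix timestamp and named values
#   - the hostname and keys below are hypothetical placeholders
import time
import requests

FARMOS_URL = "https://myfarm.farmos.net"          # hypothetical hostname
PUBLIC_KEY = "0df368bd2599a8bc63f55f73d072a5c9"   # hypothetical sensor key
PRIVATE_KEY = "private-key-goes-here"             # hypothetical

payload = {
    "timestamp": int(time.time()),
    "moisture": 27.4,
    "temperature": 18.2,
    "light": 5400,
    "fertility": 310,
}

response = requests.post(
    f"{FARMOS_URL}/farm/sensor/listener/{PUBLIC_KEY}",
    params={"private_key": PRIVATE_KEY},
    json=payload,
    timeout=10,
)
response.raise_for_status()
print("Posted data point:", response.status_code)
```

If that payload format holds, each named value becomes its own stored data point, so all four indicators from one sensor could go in a single request.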

With regard to querying the data that a Listener sensor stores in the SQL database, there are a few options. If you are self-hosting farmOS and have direct access to the database, you can perform raw SQL queries on it. Alternatively, you can query the sensor data via API requests (see docs linked above). You can also download the data from a sensor as a CSV (with optional date range filtering) through the farmOS UI; this is essentially the read-only view you described, except that it goes through the UI rather than SQL directly.
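For the API route, a sketch along these lines might be a useful starting point for pulling a date range back out and handing it to RStudio as a CSV. The query parameters and the shape of the JSON response are assumptions based on my reading of the docs linked above, so verify them there first:

```python
# Minimal sketch of pulling Listener data back out via the same endpoint
# and saving it as a CSV. Assumptions (verify against the docs):
#   - GET on /farm/sensor/listener/<public_key> returns JSON data points
#   - query parameters like name/start/end/limit are supported
#   - each returned point has timestamp, name, and value fields
#   - the hostname and key below are hypothetical placeholders
import csv
import requests

FARMOS_URL = "https://myfarm.farmos.net"          # hypothetical hostname
PUBLIC_KEY = "0df368bd2599a8bc63f55f73d072a5c9"   # hypothetical sensor key

params = {
    "name": "moisture",      # which named value to fetch
    "start": 1546300800,     # Unix timestamps bounding the date range
    "end": 1548979200,
    "limit": 10000,
}
response = requests.get(
    f"{FARMOS_URL}/farm/sensor/listener/{PUBLIC_KEY}",
    params=params,
    timeout=30,
)
response.raise_for_status()
points = response.json()

with open("moisture.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "name", "value"])
    writer.writeheader()
    for point in points:
        writer.writerow({k: point.get(k) for k in ("timestamp", "name", "value")})
print(f"Wrote {len(points)} data points to moisture.csv")
```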

For most use cases, this Listener will work fine. If you have a use case where this simple approach won’t meet your needs, the next option is to create your own sensor type. When you do that, you can take whatever approach you want to data storage, API surface, and presentation. You don’t have to use the default farmOS SQL database if you don’t want to; you could push into a different system entirely. See these two threads for some more info/ideas:

Thanks for the very substantive answer, @mstenta: that gives me solid info and options that will take me some time to digest. The one thing I’m not seeing on a quick perusal of those docs is an answer to my question about bulk-uploading the millions of historical records I’ve already accumulated; that isn’t the sort of data transfer the Listener is designed for, unless I’m misunderstanding this part. Can it in fact receive such a transaction? Or would it have to be set up as a series of transactions, or how would that work?


@walt Ah yeah, you’re correct: there isn’t an easy way to bulk upload data points via the API or the farmOS UI. That would need to be done with raw SQL. If you are interested in importing bulk records into a Farmier-hosted system, contact hello@farmier.com.
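For anyone reading this who is self-hosting and does have direct database access, a raw-SQL bulk import could look roughly like the sketch below. Heavy caveats apply: the table and column names are my reading of the .install file linked earlier (they may differ in your version), the fraction-style value storage is an assumption, and the credentials, sensor asset ID, and CSV layout are placeholders. Check the actual schema and back up the database before attempting anything like this:

```python
# Rough sketch of a raw-SQL bulk import for a SELF-HOSTED farmOS instance
# with direct MySQL access (Farmier-hosted farms should go through Farmier).
# Assumptions to verify before running anything like this:
#   - the Listener table/columns match the .install file linked above
#     (here guessed as: id, timestamp, name, value_numerator, value_denominator)
#   - `id` is the numeric asset ID of the sensor in farmOS
#   - credentials and the CSV layout below are hypothetical placeholders
import csv
from fractions import Fraction

import pymysql

SENSOR_ASSET_ID = 123  # hypothetical sensor asset ID

connection = pymysql.connect(
    host="localhost", user="farm", password="secret", database="farm"
)

rows = []
with open("history.csv", newline="") as f:
    # hypothetical CSV columns: timestamp, name, value
    for record in csv.DictReader(f):
        value = Fraction(record["value"]).limit_denominator(1000)
        rows.append((
            SENSOR_ASSET_ID,
            int(record["timestamp"]),
            record["name"],
            value.numerator,
            value.denominator,
        ))

sql = (
    "INSERT INTO farm_sensor_data "
    "(id, timestamp, name, value_numerator, value_denominator) "
    "VALUES (%s, %s, %s, %s, %s)"
)
cursor = connection.cursor()
# insert in batches so a multi-million-row import stays manageable
for i in range(0, len(rows), 10000):
    cursor.executemany(sql, rows[i:i + 10000])
    connection.commit()
cursor.close()
connection.close()
```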