As a participating “GROW Place” in this EU Soil Sensing Mission, my farm is generating data from a network of soil sensors we deployed around the farm last year. Each sensor logs a reading of 4 different indicators (soil moisture, temperature, light exposure and fertility) once every 15 minutes, so that’s nearly 100 lines of time-series data every 24 hours per sensor. With some 150 sensors logging at that rate for nearly a year already, that’s a few million records at this point — rather a lot to manage in spreadsheets, as you can imagine!
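For anyone who wants to check my back-of-envelope arithmetic, here it is as a quick sketch (plain Python, nothing farmOS-specific assumed):

```python
# Rough data-volume estimate for the sensor network described above.
readings_per_hour = 60 // 15                      # one logging event every 15 minutes
lines_per_sensor_per_day = readings_per_hour * 24 # 96 lines/day, i.e. "nearly 100"
sensors = 150
days = 365

total_lines = lines_per_sensor_per_day * sensors * days
print(lines_per_sensor_per_day)  # 96
print(total_lines)               # 5256000 -- "a few million records"
```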
So: I know that farmOS has some capability for storing sensor data — which sounds great, if it would let me easily correlate that data with production data over time — but I wonder whether the system can handle such a volume of data (my instance is hosted at Farmier, NB). In the first place, is bulk upload of records at that scale possible? And if so, what sort of query capability does the system provide? Alternatively: could I perhaps access the database with my choice of SQL query tools (given read-only access of course, to a view that might constrain my ability to run very processor-intensive queries)? And then export my custom-generated reports as CSV files that I could bring into RStudio for further analysis?
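To make the bulk-upload question concrete: I don't know what farmOS's ingest API actually looks like, but the shape of what I have in mind is batching the historical readings into manageable chunks rather than one giant request. A sketch — the record fields here are purely hypothetical, just my sensor indicators as a flat dict:

```python
import json

def chunk(records, size=1000):
    """Yield successive batches so millions of rows could be uploaded incrementally."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# Hypothetical reading format: one dict per 15-minute logging event.
records = [
    {"sensor": "s001", "timestamp": 1625097600 + n * 900,
     "moisture": 0.31, "temperature": 18.2, "light": 540, "fertility": 1.1}
    for n in range(2500)
]

# Serialize each batch as a JSON payload (whatever the real endpoint expects).
payloads = [json.dumps(batch) for batch in chunk(records)]
print(len(payloads))  # 3 batches of up to 1000 records each
```

Whether farmOS would want this via its API, a CSV importer, or something else entirely is exactly what I'm asking.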
Generally speaking: I’m happy to see all the different structured data types that farmOS is able to digest, and the affordances (including mobile) for entering such data one record at a time. But if there are any upper limits on the volume of data stored, and/or on the time to process queries across a large number of records, I need to know about such constraints at this point. Can anyone advise?