Deleting all sensor data - WX

Hi
I've had my weather station pushing data to my farmOS instance for weeks now. There are metric tonnes of records in the database, I don't see any use for this data, and I want to clean it up.
Unfortunately, I can't seem to find a way to simply truncate all the weather station sensor data that has accumulated. If it's easier, deleting ALL sensor data is fine too, because that's the only data I've been pushing in.
Any advice on which tables to hit or which WHERE clauses to slam down would be valuable.
Thanks!

1 Like

Delete the data stream?
It may also be possible to delete parts of it through the API. I'm not sure…
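
Something like this might work as a starting point. It's only a sketch, assuming a farmOS 2.x instance where basic data streams are exposed over JSON:API (I believe at /api/data_stream/basic/{uuid}) and you already have an OAuth2 access token with permission to delete them; the hostname, UUID and token below are placeholders:

```python
# Hypothetical sketch: delete a data stream over the farmOS JSON:API.
# Assumes farmOS 2.x with basic data streams at /api/data_stream/basic/{uuid}
# and an existing OAuth2 access token. Hostname, UUID and token are placeholders.
import requests

FARMOS_URL = "https://farm.example.com"                   # your farmOS instance (placeholder)
STREAM_UUID = "00000000-0000-0000-0000-000000000000"      # UUID of the data stream to remove (placeholder)
ACCESS_TOKEN = "..."                                      # OAuth2 token with delete permission (placeholder)

response = requests.delete(
    f"{FARMOS_URL}/api/data_stream/basic/{STREAM_UUID}",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/vnd.api+json",
    },
)
# JSON:API normally returns 204 No Content on a successful delete.
response.raise_for_status()
print("Deleted data stream", STREAM_UUID)
```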

2 Likes

I was going to suggest this as well. The only drawback is that you won't be able to send data to the same data stream UUID anymore. So if you have that URL set up in your weather station device and you want to continue streaming, you will need to update it.
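
For reference, the URL a device posts to includes the data stream's numeric ID and private key, so a new stream means a new URL in the device. If I remember the farmOS 2.x docs correctly, posting looks roughly like this; the ID and key here are placeholders, so treat this as a sketch rather than a copy-paste recipe:

```python
# Hypothetical sketch: push a reading to a (new) basic data stream.
# Assumes the farmOS data stream posting endpoint
#   POST /api/data_stream/{id}/data?private_key={key}
# with a JSON body of timestamp/value pairs. ID and key are placeholders.
import time
import requests

FARMOS_URL = "https://farm.example.com"   # your farmOS instance (placeholder)
STREAM_ID = 42                            # numeric ID of the new data stream (placeholder)
PRIVATE_KEY = "..."                       # private key of the new data stream (placeholder)

reading = {"timestamp": int(time.time()), "value": 21.5}

response = requests.post(
    f"{FARMOS_URL}/api/data_stream/{STREAM_ID}/data",
    params={"private_key": PRIVATE_KEY},
    json=reading,
)
response.raise_for_status()
```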

1 Like

Thanks. I didn't know deleting the stream would remove the underlying data as well. Will do!

1 Like

Hi all,

I will say this may be a bit of a 'hard-coded' approach, but it might help you @marlonv

For the farm, we used to have a webpage system running a MySQL database. When we finally added sensors for tank level and main water pump activity, the size of the DB went up like crazy, because we had set up very frequent logging in order to calibrate the sensors and tune the circuits around the Arduinos.

This is what I did a few times to reduce the size of the DB:

  1. Dump the database and save it as a backup, to keep for future reference of the sensor data points.
  2. Make a copy of the file from (1).
  3. Edit the copy by deleting all the sensor entries in the corresponding section of the dump, leaving all header rows and the last log data row intact, and making sure all closing delimiters (i.e. ';' ',' ')' ']' '}') are preserved (a rough sketch of this step is below).
  4. Restore the database from the edited file from (3).
  5. Restart the system.
  6. If everything fails due to a broken schema, you can restore your system from file (1) and decide whether to try steps (2-5) again.

That would reduce the size of the DB by about 90%.

The ease of such a dirty job kept me from exploring a way to automate it.
Then, once we were happy with everything, we lowered the data point logging frequency, and that helped reduce how often the DB needed trimming.
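
In case it helps, here is a rough Python sketch of the filtering in step (3). It assumes the dump was made with one INSERT per row (mysqldump --skip-extended-insert), and the table name and file names are made up for illustration; for a very large dump you would want to stream it rather than read it all into memory:

```python
# Hypothetical sketch of step (3): strip the sensor INSERT statements out of a
# mysqldump copy, keeping the schema and the very last sensor row intact.
# Assumes one INSERT statement per row (mysqldump --skip-extended-insert).
# Table name and file names are placeholders; adjust to your dump.
SENSOR_TABLE = "sensor_data"        # table holding the high-frequency sensor logs (placeholder)
SOURCE_DUMP = "backup_copy.sql"     # the copy made in step (2)
TRIMMED_DUMP = "trimmed.sql"        # the edited dump to restore in step (4)

insert_prefix = f"INSERT INTO `{SENSOR_TABLE}`"

# First pass: find the position of the last sensor INSERT so we can keep it.
with open(SOURCE_DUMP, encoding="utf-8") as src:
    lines = src.readlines()
last_insert = max(
    (i for i, line in enumerate(lines) if line.startswith(insert_prefix)),
    default=None,
)

# Second pass: write everything except the sensor INSERTs, keeping the last one
# so the newest data row survives and the surrounding statements stay intact.
with open(TRIMMED_DUMP, "w", encoding="utf-8") as dst:
    for i, line in enumerate(lines):
        if line.startswith(insert_prefix) and i != last_insert:
            continue
        dst.write(line)

print("Kept last sensor row" if last_insert is not None else "No sensor rows found")
```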

– ‘Sometimes you have to go fishing in the river instead of trying to streamline your code’

2 Likes

Thanks Guillermo
That's actually quite helpful. I was thinking of doing that in a Python script, but as you mentioned, it's a finicky deal. Will remember this.
On a side note, I think InfluxDB is really more suited to this sort of "every second" data. My weather stations are just creating a LOT of data, and I want it to be searchable in an instant at any time.
Cheers and have a good weekend.

2 Likes