Getting Data DIRECTLY from a Tigo TAP - is it possible ?

Thanks! This was very helpful, with some help from Google Translate.

I was able to get in and set up a cron job to export live data into InfluxDB. It's been working fine for two days, including after a reboot.

Next I'll integrate with Home Assistant and OpenEVSE. Fun!
Awesome. Let me know how you end up visualizing it in HA.

How are you running influxdb?
 

I'm thinking of two different ways to do this. Haven't settled on one yet.

Option 1: Use the InfluxDB integration for Home Assistant to create an HA sensor that queries InfluxDB on an interval.

Option 2: Use Telegraf to query InfluxDB on an interval and post the "live" values to an MQTT topic.

I may just implement both options for my use cases. The MQTT option lets my EV charger directly divert a percentage of the generated PV to charging my car, though I suppose I could do it all in Home Assistant as well.

I run InfluxDB on one of my ARM devices. That's where all my sensor data resides. With option 2 I could output to multiple destinations, one being the free InfluxDB cloud service for redundancy.

Edit: I just realized I didn't answer your question of how I'll visualize in HA. When you create a sensor you can use it on the built-in Energy dashboard. I'll just link the sensor to the Solar Generation section. Should be easy. 😎
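
For what it's worth, Option 2 doesn't strictly need Telegraf. Below is a rough sketch of the same poll-and-publish path using plain curl and mosquitto_pub, assuming InfluxDB 2.x and the mosquitto clients; the host, org, bucket, token, field name, broker, and topic are all placeholders.

Bash:
#!/bin/sh
# Sketch of Option 2 without Telegraf: poll InfluxDB for the latest PV power
# reading and republish it to an MQTT topic the EV charger can subscribe to.
# Every value below is a placeholder -- adjust to the actual setup.
INFLUX="http://influxdb.local:8086"
ORG="home"
BUCKET="solar"
TOKEN="MySecretToken"
BROKER="mqtt.local"
TOPIC="solar/pv/power_w"

# Flux query for the most recent value of the (assumed) power field.
FLUX="from(bucket:\"$BUCKET\") |> range(start: -5m) |> filter(fn: (r) => r._field == \"power\") |> last()"

# The v2 query API returns annotated CSV; pull out the _value column by name.
VALUE=$(curl -s -XPOST "$INFLUX/api/v2/query?org=$ORG" \
  -H "Authorization: Token $TOKEN" \
  -H "Content-Type: application/vnd.flux" \
  -H "Accept: application/csv" \
  --data "$FLUX" |
  awk -F, '/^#/ {next} !col {for (i = 1; i <= NF; i++) if ($i == "_value") col = i; next} {print $col; exit}')

# Publish the reading so OpenEVSE (or anything else on MQTT) can act on it.
[ -n "$VALUE" ] && mosquitto_pub -h "$BROKER" -t "$TOPIC" -m "$VALUE"

Run from cron at whatever interval makes sense.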
 

I guess I could probably run InfluxDB on the same Pi 4B that I'm running HA on? Otherwise, I have a Windows box that mostly just runs Plex.

I have data from Solar Assistant piped into the HA Energy dashboard. (Someday I'll probably replace SA with direct monitoring of Sol-Ark using RS485).
I was trying to figure out if there was a way to visualize all of the panels individually, but I'm still trying to wrap my head around YAML.
 

Oh, I see. You're asking about visualizing each panel in HA. Yeah, that's not something I want to do in HA. I intend to visualize that in Grafana as a next step.

Probably an overlay of panel values over an image of my roof? I dunno. Maybe that would be too tacky.
 
oof. That was both easier and harder than I expected.
I just installed InfluxDB into HA. This makes it easy to pass data from HA to InfluxDB, but makes it harder to get data directly into the InfluxDB instance.
With the help of ChatGPT, I created a bash script to parse
Code:
/mnt/ffs/data/daqs
and then pass the info for each optimizer to HA using the REST API (which can be called from curl).
Once in HA, I can pass the info to InfluxDB.

This worked!
But it was made more complicated because:
  • Apparently this thing runs BusyBox and doesn't have a complete version of grep, so the initial parsing method didn't work.
  • For... reasons... crond is using a file that is NOT where crontab looks. So crontab -e doesn't edit the right thing.
  • I created HA sensors for each optimizer, that then have attributes for power, voltage, etc. Although this is cleaner, I haven't figured out how to get InfluxDB to understand this (yet).
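
Roughly, the per-optimizer push into HA boils down to the curl call sketched below. The HA address, long-lived access token, entity name, and readings are placeholders; the real script fills them in from whatever it parses out of the daqs data.

Bash:
#!/bin/sh
# Minimal sketch: create/update one optimizer sensor in HA via the REST API.
# HA_URL, the token, the entity name, and the values are all placeholders.
HA_URL="http://homeassistant.local:8123"
HA_TOKEN="MyLongLivedAccessToken"

curl -s -XPOST "$HA_URL/api/states/sensor.tigo_a1" \
  -H "Authorization: Bearer $HA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "state": "212.5",
        "attributes": {
          "unit_of_measurement": "W",
          "voltage_in": 38.2,
          "current": 5.6
        }
      }'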
 

Yes, normal crontab editing will not work. Simply edit the crontab located at /mnt/ffs/etc/crontab.

Here is the script named data-to-influxdb.sh that I use to push data to my standalone InfluxDB instance:

Bash:
#!/bin/sh

INFLUXIP="192.168.0.200:8086"
ORG="homelab"
BUCKET="tigo-pv-stats"
TOKEN="MySecretToken"

# Grab the live optimizer values and join them into one comma-separated string.
DATA=$(getinfo --dir /mnt/ffs/data/daqs --prefix daqs. | tr '\n' ',')

# Drop the first 33 and last 9 bytes of the joined output, turn empty values
# ("=,") into "=0,", and swap ",TimeStamp=" for a space so the result reads
# "field=value,field=value timestamp" (line protocol minus the measurement name,
# which the curl below prepends as "Tigo-ACC").
NEWDATA=$(echo "$DATA" | tail -c +34 | head -c -9 | sed 's/=\,/=0\,/g' | sed 's/\,TimeStamp=/\ /g')

# Pad the timestamp with nine zeros (seconds -> nanoseconds, InfluxDB's default
# write precision).
NEWDATA=$NEWDATA"000000000"

# Write everything as one "Tigo-ACC" measurement via the InfluxDB 2.x write API.
curl -XPOST "http://$INFLUXIP/api/v2/write?org=$ORG&bucket=$BUCKET" --header "Authorization: Token $TOKEN" --data-raw "Tigo-ACC $NEWDATA"

My crontab has this entry:

Code:
* * * * * /mnt/ffs/bin/data-to-influxdb.sh

Hope this helps.
 
Thanks!

I saw similar code on the German forum. I tried to figure out what they were trimming and why they were padding the end with a bunch of zeros, but gave up when it seemed like I couldn't pass the data directly into the InfluxDB instance running inside HA.

I did figure out the attributes. They show up as "fields".
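
For poking around at what actually landed in a bucket, the field keys can be listed with a quick InfluxDB 2.x CLI query. This is just a sketch; the host, org, token, and bucket below are placeholders for whatever HA is actually writing to.

Bash:
# Sketch: list the field keys present in a bucket (InfluxDB 2.x CLI).
# Host, org, token, and bucket are placeholders.
influx query --host "http://influxdb.local:8086" --org home --token MySecretToken '
import "influxdata/influxdb/schema"
schema.fieldKeys(bucket: "solar")
'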
 