We will gather the data via the SNMP protocol using the pysnmp package.
Understanding SNMP and MIBs
SNMP (Simple Network Management Protocol) is a widely used protocol for monitoring the health and performance of network devices. It allows for the collection of data such as temperatures, CPU usage, and disk status.
MIBs (Management Information Bases) are databases of information that can be queried via SNMP. Each piece of data is identified by an OID (Object Identifier), which uniquely identifies a variable that can be read or set via SNMP.
You need to specify the MIB values (OIDs) to gather. I have a Synology NAS, and Synology publishes its MIB files on its support pages. We need to gather the disk name, disk model, and disk temperature; for example, walking 1.3.6.1.4.1.6574.2.1.1.6 returns one temperature value per disk, indexed by disk number.
# Imports for the pysnmp asyncio high-level API (the exact module path and
# symbol names vary slightly between pysnmp versions)
from pysnmp.hlapi.asyncio import (
    SnmpEngine, UsmUserData, UdpTransportTarget, ContextData,
    ObjectType, ObjectIdentity, bulkCmd, usmHMACSHAAuthProtocol,
)

async def run(server_name, ipaddress, username, passwd, outinfo):
    # SNMP walk for disk name, model, and temperature
    oids = [
        ObjectType(ObjectIdentity('1.3.6.1.4.1.6574.2.1.1.2')),  # Disk name (diskID)
        ObjectType(ObjectIdentity('1.3.6.1.4.1.6574.2.1.1.3')),  # Disk model (diskModel)
        ObjectType(ObjectIdentity('1.3.6.1.4.1.6574.2.1.1.6')),  # Disk temperature (diskTemperature)
    ]
    errorIndication, errorStatus, errorIndex, varBinds = await bulkCmd(
        SnmpEngine(),
        UsmUserData(username, passwd, authProtocol=usmHMACSHAAuthProtocol),  # Use the appropriate auth protocol
        await UdpTransportTarget.create((ipaddress, 161)),
        ContextData(),
        0, 10,  # Increase max-repetitions to get more results in one request
        *oids   # Query disk name, model, and temperature
    )
    if errorIndication:
        print(f"Error: {errorIndication}")
    elif errorStatus:
        print(f"Error Status: {errorStatus.prettyPrint()} at {errorIndex and varBinds[int(errorIndex) - 1] or '?'}")
    else:
        disk_data = {}
        for varBind in varBinds:
            oid, value = varBind
            oid_str = str(oid)
            index = oid_str.split('.')[-1]  # the last OID component is the disk index
            # Disk name
            if oid_str.startswith('1.3.6.1.4.1.6574.2.1.1.2'):
                disk_data.setdefault(index, {})['name'] = value
            # Disk model
            elif oid_str.startswith('1.3.6.1.4.1.6574.2.1.1.3'):
                disk_data.setdefault(index, {})['model'] = value
            # Disk temperature
            elif oid_str.startswith('1.3.6.1.4.1.6574.2.1.1.6'):
                disk_data.setdefault(index, {})['temperature'] = value
        # Print out the disk information and collect it for further processing
        for index, info in disk_data.items():
            name = str(info.get('name', 'Unknown'))
            model = str(info.get('model', 'Unknown'))
            temperature = str(info.get('temperature', 'Unknown'))
            print(f"IP Address {ipaddress}, Disk {index}: Name: {name}, Model: {model}, Temperature: {temperature} °C")
            outinfo.append({'server_name': server_name, 'ip': ipaddress, 'disk': index,
                            'name': name, 'model': model, 'temperature': temperature})
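To run this end to end, you need a small driver. Here is a minimal sketch, assuming it lives in the same file as run() above; the NAS names, IP addresses, environment-variable names, and the temperatures.csv output file are assumptions. It appends one row per disk, stamped with the current UTC time, in the CSV format shown further below.

# Minimal driver sketch: query each NAS and append the readings to a CSV file.
# Server names, IPs, environment-variable names, and the file name are illustrative.
import asyncio
import csv
import os
from datetime import datetime, timezone

SERVERS = [
    ('DS920+', '10.0.0.9'),
    ('DS220j', '10.0.2.9'),
]

async def main():
    outinfo = []
    username = os.environ['SNMP_USER']       # SNMPv3 credentials from the environment
    passwd = os.environ['SNMP_PASSWORD']
    for server_name, ip in SERVERS:
        await run(server_name, ip, username, passwd, outinfo)

    timestamp = datetime.now(timezone.utc).replace(tzinfo=None).isoformat()
    fieldnames = ['disk', 'ip', 'model', 'name', 'server_name', 'temperature', 'timestamp']
    write_header = not os.path.exists('temperatures.csv')
    with open('temperatures.csv', 'a', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        if write_header:
            writer.writeheader()
        for row in outinfo:
            writer.writerow({**row, 'timestamp': timestamp})

if __name__ == '__main__':
    asyncio.run(main())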
version: '3.8'
services:
  pingchart:
    build: .
    restart: always
    container_name: synology-temperature
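The build: . line expects a Dockerfile next to the compose file. A minimal sketch, assuming the script above is saved as synology_temperature.py and handles its own polling loop (the file name and base image are assumptions):

# Minimal Dockerfile sketch; adjust the script name and dependencies to your setup
FROM python:3.12-slim
WORKDIR /app
# pysnmp for the SNMP queries; add any other packages your script needs
RUN pip install --no-cache-dir pysnmp
COPY synology_temperature.py .
CMD ["python", "synology_temperature.py"]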
Then start the container with docker-compose up -d.
I am affiliated with 2minlog - a simple system to gather, process, and visualize data. You send the data there via HTTPS requests (encoded in the URL or in the request body) and set up a visualization script there. The graphs are then readily available from anywhere you need them.
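For illustration, sending one reading boils down to a single HTTPS request. The endpoint URL and parameter names below are placeholders, not the real 2minlog API; check the 2minlog documentation for the actual format.

# Illustrative only: the URL and parameter names are placeholders,
# not the actual 2minlog API - consult the 2minlog documentation.
import requests

API_URL = 'https://example.invalid/log'   # placeholder endpoint
SECRET = 'MY-SECRET'                      # placeholder credential

def send_reading(reading):
    # Data encoded in the URL query string ...
    requests.get(API_URL, params={'secret': SECRET, **reading}, timeout=10)
    # ... or sent in the request body:
    # requests.post(API_URL, params={'secret': SECRET}, json=reading, timeout=10)

send_reading({'server_name': 'DS920+', 'disk': 1, 'name': 'Disk 1', 'temperature': 38})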
Here is a CSV file. Can you write code that does the following? (A matplotlib sketch along these lines is shown after the sample data.)
Split the data into different graphs by combining the server name and the disk name (e.g., DS920+ / Disk 1).
Each graph will show the temperature.
Each graph will have a title (e.g., DS920+ / Disk 1).
The graphs will have the same temperature range.
The background will be black, the graph background will also be black, and the graph color will range from dark green (low temperatures) to light green (high temperatures).
There will be two thin lines - 20 °C (blue) and 45 °C (red).
Trim the data to the last week, with tick marks at midnight of every day.
The data are in UTC; convert them to the Europe/Berlin time zone.
The resolution of the total image is 600 x 1024 pixels (height x width).
Save the image to PNG.
disk,ip,model,name,server_name,temperature,timestamp
0,10.0.0.9,ST4000VN008-2DR166,Disk 3,DS920+,38,2025-09-19T20:19:48.723761
1,10.0.0.9,ST16000NM000J-2TW103,Disk 4,DS920+,42,2025-09-19T20:19:49.253975
2,10.0.0.9,ST4000VX007-2DT166,Disk 1,DS920+,38,2025-09-19T20:19:49.818734
3,10.0.0.9,ST4000VX007-2DT166,Disk 2,DS920+,39,2025-09-19T20:19:50.393793
0,10.0.2.9,ST12000NM001G-2MV103,Disk 1,DS220j,28,2025-09-19T20:19:50.873142
0,10.0.0.9,ST4000VN008-2DR166,Disk 3,DS920+,38,2025-09-19T20:20:02.119583
1,10.0.0.9,ST16000NM000J-2TW103,Disk 4,DS920+,42,2025-09-19T20:20:02.596654
2,10.0.0.9,ST4000VX007-2DT166,Disk 1,DS920+,38,2025-09-19T20:20:03.101480
3,10.0.0.9,ST4000VX007-2DT166,Disk 2,DS920+,39,2025-09-19T20:20:03.697423
0,10.0.2.9,ST12000NM001G-2MV103,Disk 1,DS220j,28,2025-09-19T20:20:04.221348
0,10.0.0.9,ST4000VN008-2DR166,Disk 3,DS920+,38,2025-09-19T20:25:02.254611
1,10.0.0.9,ST16000NM000J-2TW103,Disk 4,DS920+,42,2025-09-19T20:25:02.714633
2,10.0.0.9,ST4000VX007-2DT166,Disk 1,DS920+,38,2025-09-19T20:25:03.295622
3,10.0.0.9,ST4000VX007-2DT166,Disk 2,DS920+,39,2025-09-19T20:25:03.780728
...
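Here is a minimal matplotlib sketch along these lines. It assumes the collected data sits in temperatures.csv and writes temperatures.png; the file names are assumptions, and it is one way to meet the requirements above, not necessarily the exact script used.

# Plotting sketch: one panel per server/disk, shared temperature range,
# black backgrounds, green colormap, 20/45 °C reference lines, last week only.
import pandas as pd
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
from matplotlib.colors import LinearSegmentedColormap
from zoneinfo import ZoneInfo

pd.plotting.register_matplotlib_converters()  # let matplotlib handle pandas datetimes
TZ = ZoneInfo('Europe/Berlin')

df = pd.read_csv('temperatures.csv')
# Timestamps are stored in UTC; convert them to Europe/Berlin
df['timestamp'] = pd.to_datetime(df['timestamp'], utc=True).dt.tz_convert(TZ)

# Keep only the last week of data
cutoff = df['timestamp'].max() - pd.Timedelta(days=7)
df = df[df['timestamp'] >= cutoff]

# One graph per server/disk combination, e.g. "DS920+ / Disk 1"
df['graph'] = df['server_name'] + ' / ' + df['name']
groups = sorted(df['graph'].unique())

# Shared temperature range and a dark-green-to-light-green colormap
tmin, tmax = df['temperature'].min(), df['temperature'].max()
cmap = LinearSegmentedColormap.from_list('greens', ['darkgreen', 'lightgreen'])

# 1024 x 600 px at 100 dpi, black figure background
fig, axes = plt.subplots(len(groups), 1, sharex=True, sharey=True,
                         figsize=(1024 / 100, 600 / 100), dpi=100, squeeze=False)
fig.patch.set_facecolor('black')

for ax, graph in zip(axes.ravel(), groups):
    sub = df[df['graph'] == graph]
    ax.set_facecolor('black')
    # Color each point by its temperature: dark green = cool, light green = warm
    ax.scatter(sub['timestamp'], sub['temperature'],
               c=sub['temperature'], cmap=cmap, vmin=tmin, vmax=tmax, s=4)
    # Thin reference lines at 20 °C (blue) and 45 °C (red)
    ax.axhline(20, color='blue', linewidth=0.8)
    ax.axhline(45, color='red', linewidth=0.8)
    ax.set_title(graph, color='white', fontsize=9)
    ax.set_ylim(min(tmin, 20) - 2, max(tmax, 45) + 2)
    ax.tick_params(colors='white')
    # Tick marks at midnight of every day, in local time
    ax.xaxis.set_major_locator(mdates.DayLocator(tz=TZ))
    ax.xaxis.set_major_formatter(mdates.DateFormatter('%d %b', tz=TZ))

fig.tight_layout()
fig.savefig('temperatures.png', facecolor=fig.get_facecolor())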
You can use the fully managed 2minlog to gather, process, and visualize the data. I display the results on an Android tablet sitting on my table and cycle through the various graphs. You can also save the data on your local file system and do the same.
#Synology #SynologyNAS #Temperature #Monitoring #DataVisualization #Matplotlib #SNMP #2minlog #Python #Docker