Huge Database

Forum for questions and support relating to the 1.31.x releases only.
bbunge
Posts: 2934
Joined: Mon Mar 26, 2012 11:40 am
Location: Pennsylvania

Re: Huge Database

Post by bbunge »

juanmoura wrote: Thu Feb 08, 2018 1:03 am Alec, I'm sorry but the statement you made is wrong. Come on, my setup basically consists of:

1 camera, with the resolution set in ZoneMinder to 640x480, while the resolution on the camera itself is 1920x1080 ...
In the Storage tab, saving JPEGs is disabled and H264 Passthrough is enabled.

That way, when I click on the monitor I watch at 640x480, but when it detects motion and records, and I open the recorded file, it is at 1920x1080.

This way I can save RAM and add more cameras.

When I click on the FRAMES tab I do not see a frame extracted from the MP4; I see an extracted frame at 640x480 resolution, even though I have JPEG saving disabled.

What I mean is that my system has 43 cameras with 12 TB of storage, all 2 MP, set to Modect. The server started operating in mid-November 2017 and is now at 52% of disk space.

I was backing up the database to separate it from the server using multi-server mode, and I came across this situation: a 36 GB database. I went back and saw that it was the Frames table storing photos that are unnecessary.
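To see where the space goes, a per-table size check along these lines can be run (just a sketch; it assumes the default ZoneMinder database name 'zm'):

    -- Approximate on-disk size per table; table_rows is only an estimate for InnoDB.
    -- Assumes the default ZoneMinder database name 'zm'.
    SELECT table_name,
           ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb,
           table_rows
    FROM information_schema.tables
    WHERE table_schema = 'zm'
    ORDER BY (data_length + index_length) DESC;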

When ZoneMinder is running, the amount of resources used by MySQL is large, and the more unnecessary work it does, the fewer resources are left for the system.

In the log I see it creating the images, then creating the .mp4, and then deleting the JPEG images it created, but it still keeps a copy inside the database unnecessarily.

Why save frames if I already have the MP4 file?

Removing this from the system will improve performance.
You are not going to like this... It looks like you are running a production server on a testing version of ZoneMinder. Now you are in trouble and looking for a way out.
Your way out is to go back to ZoneMinder 1.30.4 and live with a well-performing system. If and when 1.31.x is released, it may be safe for you to upgrade. It looks like you are computer savvy, so you know of what I write... Sorry for your troubles... Test on something you can afford to lose!
juanmoura
Posts: 91
Joined: Fri Nov 24, 2017 11:46 am

Re: Huge Database

Post by juanmoura »

I understand. In fact, I'm testing on VMs, and when I see that a release is stable I update. The point of my observation was that (as I see it) there is an error that maybe no one has noticed, so I'm just noting it for a future fix; what I described would, I believe, greatly optimize the system.
Something I should say: the system the community is building is fantastic.
zd59
Posts: 102
Joined: Wed Jan 18, 2017 1:39 pm
Location: EU - Slovenia

Re: Huge Database

Post by zd59 »

I compiled the Git sources on Slackware and updated the ZM database structure from zm_update-1.31.1.sql (included in the Git source).
I cannot confirm juanmoura's claims.
The Frames table cannot be huge. Its structure is:
Id - int(10)
EventId - int(10)
FrameId - int(10)
Type - enum('Normal','Bulk','Alarm')
TimeStamp - timestamp
Delta - decimal(8,2)
Score - smallint(5) unsigned

As shown above, there are no binary (BLOB) fields that could store JPGs.
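For reference, that column list corresponds roughly to a table definition like this (only a sketch; exact keys, NULL-ability and defaults may differ from the shipped zm_create.sql):

    -- Rough reconstruction of the Frames table from the columns listed above.
    -- The primary key and defaults are assumptions, not copied from the actual schema.
    CREATE TABLE Frames (
      Id        INT(10),
      EventId   INT(10),
      FrameId   INT(10),
      Type      ENUM('Normal','Bulk','Alarm'),
      TimeStamp TIMESTAMP,
      Delta     DECIMAL(8,2),
      Score     SMALLINT(5) UNSIGNED,
      PRIMARY KEY (Id)
    );

Each row is only a handful of fixed-width numeric and timestamp columns, so the table grows with the number of frame records, not with the size of the images.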
I also checked with H.264 passthrough stream recording.
A 7 MB event video was not mirrored into the Frames table, which did not grow noticeably (there was already one 900 KB video, and this 7 MB one was added). The table is still 688 KB in size.
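For anyone who wants to repeat the check, something along these lines works (a sketch; it assumes you are connected to the ZoneMinder database, 'zm' by default):

    -- Size and row count of the Frames table, before and after recording an event.
    SHOW TABLE STATUS LIKE 'Frames';
    SELECT COUNT(*) FROM Frames;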