100 chainstore IP cameras project - Please advise

Support and queries relating to all previous versions of ZoneMinder
adahary
Posts: 14
Joined: Fri Jun 29, 2007 2:15 pm


Post by adahary » Thu Nov 15, 2007 10:58 am

Hi,

I have had ZM up and running with 10 IP cameras for the last few months, just for learning and testing. Each camera sits on its own remote ADSL site. Now I'm pretty convinced that ZM is the system to take into production.

I have read many discussions on the forum to figure out how to build my ZM server for this project, but I haven't yet put all the pieces together.

I would like to share my project with the forum and hopefully get the best help to make it happen.

Project requirements:
- Managing a chain of 100 store locations, each with 1 IP camera on an ADSL line of 150K/1.5M (at least). All cameras should function in Monitor and Modect-or-Record modes.
- Each store should watch only its own camera (separate users).
- Assuming 10% of users watch concurrently.
- A clear picture is a must (320x240).
- Nearly smooth motion (at least 3fps).
- Keeping 48 hours of footage per camera.

Calculations:
- Total fps = 100 * 3fps = 300fps
- RAM = 320x240 * 3 bytes * (RingB=10) * (cameras=100) + (OH 10%) ≈ 0.25GB
- BW pulled from cameras = (cameras=100) * (20KB image size) * 3fps * (8 bits) / (1024*1024) ≈ 46Mbps
- BW pushed to 10% viewers = 10% * 46Mbps ≈ 4.6Mbps
- Disk for 48hrs/camera = (cameras=100) * (20KB image size) * 3fps * 3600 * 48 ≈ 1TB
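These figures can be checked with a short script. A sketch using the same assumptions as the list above (20KB average JPEG, ring buffer of 10 raw frames, 10% RAM overhead):

```python
# Back-of-the-envelope capacity check for 100 cameras at 320x240, 3fps.
# Assumptions from the post: 20KB average JPEG, ring buffer of 10 frames,
# 10% RAM overhead, 48 hours of retention.
CAMERAS = 100
FPS = 3
WIDTH, HEIGHT, BYTES_PER_PIXEL = 320, 240, 3
RING_BUFFER = 10
FRAME_KB = 20            # average compressed JPEG size
RETENTION_HOURS = 48

total_fps = CAMERAS * FPS

# Shared-memory ring buffers hold raw (uncompressed) frames.
ram_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL * RING_BUFFER * CAMERAS * 1.1
ram_gb = ram_bytes / 2**30

# Bandwidth pulled from the cameras (compressed JPEGs, in Mbit/s).
bw_mbps = CAMERAS * FRAME_KB * 1024 * FPS * 8 / 2**20

# Disk needed for 48 hours of continuous recording, in GiB.
disk_gib = CAMERAS * FRAME_KB * 1024 * FPS * 3600 * RETENTION_HOURS / 2**30

print(f"{total_fps} fps total, {ram_gb:.2f} GB RAM, "
      f"{bw_mbps:.1f} Mbps in, {disk_gib:.0f} GiB for 48h")
```

This reproduces the ~47Mbps and ~1TB figures, and suggests the raw-frame RAM requirement is nearer 0.25GB than 2.5GB with a ring buffer of 10.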

According to the above calculations, the ZM server should be equipped with:
Dual-CPU Xeon or Core2Duo
RAM = 4GB
Disk = 1TB SATA
1Gbps network

To reduce CPU load and pushed bandwidth, I was thinking of removing the montage view and replacing it with an ActiveX view that forces the users to pull the live stream directly from the camera itself.

I would like to get any comments/recommendations/corrections/advice on this setup.

Regards

ammaross
Posts: 61
Joined: Mon Mar 12, 2007 8:34 pm
Location: Utah, USA

Post by ammaross » Thu Nov 15, 2007 8:17 pm

I currently operate a 220ish FPS IP-based ZM server pulling from 9 locations that are on ADSL (256k up). I pull 320x240 colour at about 1 or 2 fps before the DSL is strained. Pulling B&W will spare some of that bandwidth, so you should get a steady 3fps. To save your bandwidth, though, you would most likely need to implement an ActiveX control or the like to force your users to view their local camera. You may notice a drop from 3fps when your users review the recordings, however.

Also, if each user only has one camera assigned to them, they will not have a montage view option, so no need to worry about changing that. And, if you do the local view thing, you won't have to worry about that 4.6Mbps due to 10% user viewing.

My server is running a QX6600 Core2Quad and runs a load around 2.2ish (out of 4). You'll be running B&W which will cut your CPU load, but I would still recommend a slightly slower quad core (Clovertown or Core2Quad) over a semi-faster dual-core for the same price.

Another note: if your system will be writing 100-odd jpgs to the hard drive per second, you might want to consider multiple smaller hard drives. This will improve your write throughput and also seek times for playback. Also, in the event of a hard drive failure, you won't lose ALL of your video, just what was being saved on that particular drive. One thing to watch out for, though, is that I've heard the PurgeWhenFull filter has problems with multi-drive environments.
You could consider using a RAID setup, but you only get marginally better write throughput for the redundancy, and it really kills your read-seek and write-seek as you add more disks.

(disclaimer: the above is my opinion which is subject to change without notice. :wink: )

Lee Sharp
Posts: 1069
Joined: Sat Mar 31, 2007 9:18 pm
Location: Houston, TX

Post by Lee Sharp » Fri Nov 16, 2007 4:48 am

I am all analog, so keep that in mind. My big box is 16 BT878 cards, real time, so 480fps max. It is on a mid grade Core2Duo with 2 gig of ram and a 500 gig HD. I get 3-5 weeks with modetect.

Flasheart
Posts: 345
Joined: Thu Jul 06, 2006 2:27 pm

Post by Flasheart » Fri Nov 16, 2007 7:26 am

Obviously IP cams. I recommend Axis 205s or 206s if the light level is good. It might be worth negotiating discounts at that volume.

I'm thinking that 46Mbps (plus 10-20% to allow several viewers to check playback) will need a commercial internet link. 320x240 really isn't great - it might be worth asking your customer to go the extra mile to 640x480. Identifying faces at 320x240 is often more guesswork than anything else.
Bandwidth costs will be a fair bit there.

Redundancy: instead of trying to shoehorn all that onto one monster box, why not have two or three lower-specced servers with the load split between them? Each can mirror the others' config, and if you design in enough spare capacity, you could switch the cameras to another of the boxes if one failed. (Handy if they're hosted remotely from you.)

Sounds like quite an interesting project!

adahary
Posts: 14
Joined: Fri Jun 29, 2007 2:15 pm

Post by adahary » Fri Nov 16, 2007 12:11 pm

ammaross,
ammaross wrote:I currently operate a 220ish FPS IP-based ZM server pulling from 9 locations that are on ADSL (256k up).
What IP camera are you using? How much bandwidth is used over the 256k ADSL link when you pull 1 or 2 fps of 320x240 pics?
ammaross wrote:if your system will be writing 100-odd jpgs to the hard drive per second, you might want to consider multiple smaller hard drives
Could you please point me to a 'how to' for setting up multiple disks with ZM?

I think I should consider switching all monitors to Modect when the stores are closed. That would reduce concurrent disk writing to roughly the top 10 cameras out of 100, which I guess should resolve the disk issue.

Thanks for your help

adahary
Posts: 14
Joined: Fri Jun 29, 2007 2:15 pm

Post by adahary » Fri Nov 16, 2007 12:25 pm

Flasheart,

I was thinking of setting up a 640x480 image, but that would kill my ADSL upload.
I've read that Axis cameras can limit an image's bandwidth cost; do you know how far Axis can reduce the bandwidth cost of a 640x480 image?

The server is hosted in an ISP server farm, so bandwidth is a matter of cost, priced per camera location. On the other hand, shelf space is priced per U, so several servers would add 100x$$$ to the monthly cost and eventually increase the cost per camera location.

Regards

ammaross
Posts: 61
Joined: Mon Mar 12, 2007 8:34 pm
Location: Utah, USA

Post by ammaross » Fri Nov 16, 2007 8:38 pm

adahary wrote:ammaross,
I currently operate a 220ish FPS IP-based ZM server pulling from 9 locations that are on ADSL (256k up).
What IP camera are you using? How much bandwidth is used over the 256k ADSL link when you pull 1 or 2 fps of 320x240 pics?
I have a mixture of Axis 2100s, Axis 210s, and Intellinet cameras. Of the three, I would recommend the 210s (very clear picture). All of these cameras are indoor. We have VPNs implemented from the remote sites to the admin office, so there's a touch of overhead there.

I experimented with what would clog the upload for the DSL connections: I tried to pull 5fps, but could only manage maybe 2.2fps with any consistency. Since we have other priority traffic that connects back to our admin office, I had to throttle the fps on ZM's side as well as set a max bandwidth in the camera settings. I'm sure if I was passing purely black-and-white images, I could get more fps. Also, as a note, 640x480 over a DSL would be terrible.

We have considered deploying some small servers (old machines equipped with Ubuntu or the like) which would record the cameras locally and then offload the actual event data to a main server. This would eliminate the need for a constant 3fps going out of the site, just to get to the main server and realize there's no movement. However, this method would require some restructuring of the ZM database...or at least some personally-written utility.
adahary wrote:
if your system will be writing 100-odd jpgs to the hard drive per second, you might want to consider multiple smaller hard drives
Could you please point me to a 'how to' for setting up multiple disks with ZM?

I think I should consider switching all monitors to Modect when the stores are closed. That would reduce concurrent disk writing to roughly the top 10 cameras out of 100, which I guess should resolve the disk issue.
I haven't attempted a multiple-disk installation myself yet (due to lack of high-throughput requirements) and have resorted to lazy, Linux-enabled software RAID (you've got to hate it when the hardware running a proprietary hardware-assisted software RAID dies). I'm no Linux expert (important thing to remember), but I think one could set up a multi-disk system by setting up all 100 cameras, then deleting the folders created in zm/events and replacing them with (soft/hard?) links to a folder set up on each drive.
ie:
  zm/events/1 -> /mnt/disk1/1
  zm/events/2 -> /mnt/disk2/2
  zm/events/3 -> /mnt/disk3/3
  zm/events/4 -> /mnt/disk4/4
  zm/events/5 -> /mnt/disk1/5
  zm/events/6 -> /mnt/disk2/6
  etc.
The camera names are soft-linked to the zm/events/<monitorid> folder, so those shouldn't need to be changed.
So, in theory, this would spread disk writing across 4 drives, segmented however you want. Maybe the first 25 cameras on disk1, the next 25 on disk2, etc. You could even use a small 40GB drive to host the OS, to keep MySQL/host-OS access separate too. Granted, 4 drives would probably be overkill at only 25 cameras per drive, but if you've ever tried copying multiple large files in Windows, you can see just how quickly a hard drive can get bogged down.
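The round-robin linking described above could be scripted; a sketch, assuming ZM is stopped first and the events folders are still empty (the mount points and monitor IDs here are illustrative):

```python
# Round-robin each monitor's event folder across several data disks,
# as described above. Paths and monitor IDs are illustrative only.
import os

def link_monitors(events_dir, disks, monitor_ids):
    """Replace zm/events/<id> with a symlink to a per-disk folder."""
    for i, mon in enumerate(monitor_ids):
        target = os.path.join(disks[i % len(disks)], str(mon))
        link = os.path.join(events_dir, str(mon))
        os.makedirs(target, exist_ok=True)    # folder on the chosen disk
        if os.path.isdir(link) and not os.path.islink(link):
            os.rmdir(link)                    # drop the empty original folder
        if not os.path.islink(link):
            os.symlink(target, link)          # zm/events/<id> -> /mnt/diskN/<id>

# e.g. link_monitors("/var/www/zm/events",
#                    ["/mnt/disk1", "/mnt/disk2", "/mnt/disk3", "/mnt/disk4"],
#                    range(1, 101))
```

With four disks this gives the first camera disk1, the second disk2, and so on, wrapping around - the same layout as the list above.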

Do look into modding the monitor view to stream straight from the local camera though (at least for non-admin users), since that will save you on bandwidth from your server. I know hosted locations nail you to the wall for excess traffic.

Flasheart
Posts: 345
Joined: Thu Jul 06, 2006 2:27 pm

Post by Flasheart » Sat Nov 17, 2007 8:48 am

adahary wrote:Flasheart,

I was thinking to setup a 640x480 image but that will kill my ADSL upload.
I've read that axis cameras can limit the image bw cost; do u know how low axis can narrow an image of 640x480 by bw cost?
Yep - Axis' web configuration allows you to specify default image size, colour, fps-per-client, jpg compression (quality), light level and lots more.

In addition, there is a wide variety of url-specified switches to override those settings for individual feeds.

eg: url_to_camera/path/pic.jpg?color=0 will return a greyscale picture regardless of global settings. fps, resolution and quality can also all be specified in the URL. 640x480 in B&W is probably only a little larger than 320x240 in colour and gives far more clarity.
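For scripting, those per-request overrides are just query parameters appended to the snapshot URL. A minimal sketch - the color=0 switch and the pic.jpg path come from the example above; any other parameter names would need checking against the camera's own documentation:

```python
# Build a snapshot URL with per-request overrides, as in the example above.
# Only color=0 is taken from the post; other parameter names should be
# verified against Axis' documentation before use.
from urllib.parse import urlencode

def snapshot_url(camera_base, **params):
    """camera_base e.g. 'http://cam1.example/path/pic.jpg' (hypothetical host)."""
    return f"{camera_base}?{urlencode(params)}" if params else camera_base

url = snapshot_url("http://cam1.example/path/pic.jpg", color=0)
# -> "http://cam1.example/path/pic.jpg?color=0"
```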
adahary wrote:The server is hosted in an ISP server farm so BW is a matter of cost which is priced per camera location. On the other hand, shelf space is priced per U so several servers will increase the monthly cost by 100x$$$ and eventually increase cost per camera location.
That's a good point I hadn't considered. Also they can probably manage backups for you.

Flasheart
Posts: 345
Joined: Thu Jul 06, 2006 2:27 pm

Post by Flasheart » Sat Nov 17, 2007 8:54 am

ammaross wrote:I'm no Linux expert (important think to remember), but I think one could set up a multi-disk system by setting up all 100 cameras, and then deleting the folders created in zm/events and replacing them with (soft/hard?) links to a folder set up on each drive.
You don't even need to do that now - ZoneMinder's "Paths" config section allows you to specify the data paths via the web config. They can be anywhere on the mounted filesystem, even remote network shares (though I don't think that would be a good idea for speed!).

So you'd just make the partition as you wish, create the top level paths (eg: /data/images /data/events ) - grant the right permissions and restart ZM.

If there are existing database entries it's a little more complicated.

You need to:
1. Stop ZM.
2. Move ALL the data across to the new directory structure, preserving its layout.
3. Change the path in the ZM config.
4. Restart ZM.

If you change the path, restart ZM, and only then try to move the data - ZoneMinder will notice the data isn't there and delete the corresponding database entries, nuking everything. (Don't ask me how I know this... :) )
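The move step above can be sketched like this, preserving the layout while ZM is stopped (the source and destination paths are illustrative):

```python
# Move the existing event data into the new directory structure,
# preserving layout, while ZM is stopped - per the steps above.
import os
import shutil

def migrate_events(old_dir, new_dir):
    """Move every entry under old_dir into new_dir, keeping names intact."""
    os.makedirs(new_dir, exist_ok=True)
    for name in os.listdir(old_dir):
        shutil.move(os.path.join(old_dir, name), os.path.join(new_dir, name))

# Order matters (see the warning above):
# 1. stop ZM
# 2. migrate_events("/var/www/zm/events", "/data/events")  # paths illustrative
# 3. change the path in the ZM config, then restart ZM
```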

cordel
Posts: 5210
Joined: Fri Mar 05, 2004 4:47 pm
Location: /USA/Washington/Seattle

Post by cordel » Sat Nov 17, 2007 9:05 am

Actually, moving events is a little more involved: the path provided in the GUI does not get processed by Apache, so events moved outside Apache's configured web root are not accessible through the web interface.

The easy way around this is to create a link in the web folder to the new location, though this may still require configuring Apache to follow symlinks for that folder. This is well documented on the Apache web site, and it is the simplest approach.

ammaross
Posts: 61
Joined: Mon Mar 12, 2007 8:34 pm
Location: Utah, USA

Post by ammaross » Mon Nov 19, 2007 4:29 am

Flasheart wrote:
ammaross wrote:I'm no Linux expert (important think to remember), but I think one could set up a multi-disk system by setting up all 100 cameras, and then deleting the folders created in zm/events and replacing them with (soft/hard?) links to a folder set up on each drive.
You don't even need to do that now - Zoneminder "Paths" config section allows you to specify the data paths via the web config. They can be anywhere on the mounted filesystem, even remote network shares (Though I don't think that would be a good idea for speed!)

So you'd just make the partition as you wish, create the top level paths (eg: /data/images /data/events ) - grant the right permissions and restart ZM.
The problem is, "Paths" only allows you to specify the events directory. The recommendation I was making was about distributing the individual monitors within the events folder across multiple hard drives, thus maintaining semi-decent write/read performance. This is where Cordel's sym (soft/hard) links come in, but for monitors rather than the events folder itself.

Flash_
Posts: 441
Joined: Wed Jan 11, 2006 12:19 pm

Post by Flash_ » Mon Nov 19, 2007 7:17 am

Ah, sorry - I misunderstood. The only downside with that is that PurgeWhenFull and ZM's on-screen disk-usage reporting will be incorrect. The former could cause some problems, as it reports the disk usage of the mount drive, not any subsequently symlinked mounts, and won't reliably fire if one of those becomes full.

This could probably be overcome with some external scripting.
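Such an external script could poll each data mount directly; a sketch (the mount points are assumptions, and what to do when a disk fills - purging oldest events, alerting - is left out):

```python
# Check each symlinked data mount's usage directly, since ZM's own
# PurgeWhenFull only sees the drive its events path is mounted on.
import shutil

def usage_pct(path):
    """Percentage of the filesystem at `path` that is in use."""
    du = shutil.disk_usage(path)
    return 100 * du.used / du.total

# e.g. (hypothetical mount points):
# for mount in ["/mnt/disk1", "/mnt/disk2", "/mnt/disk3", "/mnt/disk4"]:
#     if usage_pct(mount) > 90:
#         pass  # purge oldest events on that disk, or alert an admin
```

Run from cron, this would watch each disk independently rather than only the one ZM's events path lives on.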

Centry
Posts: 1
Joined: Mon Nov 19, 2007 2:25 pm

Post by Centry » Mon Nov 19, 2007 2:53 pm

Hi to all.

I also want to build something similar for a small number of locations. Hosting at an ISP is a nice idea, but I think that if you have a good broadband line, you can place ZM in your home/office and save the costs.

I'd be curious to hear how it works out for you, adahary.
