These two pics (of relatively bad quality) should convince you to clean your laptop more often than once every 5 years. On the first pic you see the dust nest on the processor fan. On the second pic you see a heavy layer of dried thermal grease – it may still have some healing effect, but it certainly has no cooling capability left.
Well, actually these are not building plans but plain ideology. Let’s call it the architectural view 😉
One of the most crucial things in the world is to listen to your own thoughts. To understand why you did this or that, you need to listen to yourself extra carefully 😉 In some cases you will have the official reasoning and then your own inner reasoning (“it’s cool”, etc).
In my particular case, it took approximately a month for me to understand why I built the particular kind of NAS that I actually built and not some other variety.
Below I describe everything I was able to recall, remember and recognize.
1. I was in a relative hurry. My previous 1TB one-disk “NAS” started to develop disk errors.
Yes, I have a backup, but my backup was made (again in a hurry) onto a temporary setup involving eight 250GB disks in RAID-6 (on the picture), which disks, in turn, were 5 years old (=36,000 working hours). So I needed some immediate solution to continue with my data-rich personal life.
On the picture you see the temporary setup used while copying files from my old 8x250GB RAID-6 “backup” to the new NAS. The source device is a Compaq ML-350. The fan block contains 3 fans built together. rsyncing some 850+GB of data took approx 40 hrs.
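Out of curiosity, that transfer speed can be sanity-checked with simple arithmetic (the rsync invocation in the comment is only a hypothetical reconstruction – hosts and paths are invented):

```shell
# A copy of this kind would be started roughly like this (hosts/paths invented):
#   rsync -avH --progress /mnt/old-raid/ root@freenas:/mnt/tank/
# Sanity check: ~850 GB in ~40 hours over a 100 Mbit LAN
mb=$((850 * 1024))      # data moved, in MB
secs=$((40 * 3600))     # elapsed time, in seconds
echo "average: $((mb / secs)) MB/s"   # ~6 MB/s; a 100 Mbit link tops out around 12 MB/s
```

So roughly half of the theoretical link speed – quite plausible for rsync running on old hardware.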
2. Some of my files were stored as a single instance on separate 500GB and 1TB external USB disks. The oldest of these disks were 4-5 years old. Nothing critical, but still 😉
3. I am relatively paranoid. Actually, I haven’t lost much data in my life. One famous case was in 1996 when I made a “space error” in a Linux command (rm -rf a *) and lost the result of 2 days of work, but the file size was still under 100k.
Around 2003, playing with a 40GB encrypted disk, I messed up my whole MP3 archive. 98% of the files were recovered, but some of them only from the lost+found directory. Believe me, ID tags were not very popular then. So now I have some albums by unknown groups as music without a name 🙂
4. One of the biggest risks I recognize in the SoHo sector is the quality of the AC power. During my life I have experienced 3-4 minor adventures with my data, and the reason has always been some home-grade equipment (an Asus slim micro case or similar) combined with several consecutive AC outages. My experience is that the quality of professional servers is much better – their boot sequence is slower, and the PSUs are built considerably better (component quality!). The only problem with old retired professional equipment is the cost of power. Still, I am quite willing to pay a slightly higher electricity bill. My current NAS has 2 separate PSUs which can carry it through short 1-2 second blackouts. Impressive! (Some day I will write an article about UPSes.)
5. The learning curve. I would rather take a steeper learning curve than pay a profit margin to Synology. This probably best explains why I made up my mind in favour of FreeNAS 9.x and ZFS.
6. The usual cancer of the SOHO sector – you are only willing to buy 1-2-3 disks per month, not to spend half or all of your salary on some more professional solution.
7. Easy availability of retired equipment (5-8 years old). This is somewhat related to my daily work. I am willing to collect and carry away any heavy steel box.
8. Easy availability of several retired SATA controllers … the main limitation being the 2TB barrier.
9. The 100Mbit network installation at my home. Of course I will soon upgrade it to 1Gbit, but not yet. The issue is that I have several VLANs and some security features, so I need professional managed, SNMP-capable switches. Due to the topology I need at least two such switches, which could mean some 200-400€ in resulting cost.
10. My decision not to jump over the 2TB barrier this time. Partially it was due to the former reason (availability of old pro controllers), partially to the fact that 4TB hard disks are relatively new and their reliability is still unknown. It takes much more care with any (except the latest) hardware to flawlessly use things like UEFI boot, new partitioning programs, etc. While some of my motherboards support UEFI, they are of the earliest generation and have some glitches.
Now some negative decisions, i.e. things that I avoided … even when avoiding them involved some extra cost or risk.
11. Purchasing some new and beautiful SATA enclosures? No! These would dictate a new case. The verdict – cost-prohibitive.
12. Buying a new modern motherboard? No! This would dictate a newer processor and plenty of memory (=cost!). The same verdict. What is the use of the speed offered by pricey SATA-3 ports if my network does not support 1Gbit speeds?! In addition, most SATA RAID controllers at my disposal are PCI-X, which is not supported on most SoHo-grade mobos.
13. ECC memory, which is a must when using ZFS, is not available for most SOHO mobos, while it was relatively cheap and available for the chosen hardware.
14. I had to choose between three flavours of RAID controller – a single 8-port 3ware card without BBU (and no idea where to get a replacement for disaster recovery), a single P400, and an 8-port P600 (I have several of these in my shack, which means I can replace it in case of a disaster). I chose the P600 for reasons of reliability and the BBU being present, which means I traded off some speed (which was irrelevant with my old hardware anyway) and I lost S.M.A.R.T. capability – the P600 only has a RAID0 mode, no JBOD mode. While the cciss drivers in theory support S.M.A.R.T., it could be very tricky to achieve under FreeBSD with a USB boot.
15. Leaving S.M.A.R.T. out of the equation is not a deadly sin with ZFS, but it surely limits the usefulness of the setup. I do not expect more than 3 years of lifetime from this solution.
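For reference, smartmontools does in principle know how to address physical drives hidden behind a cciss controller, one drive index at a time – whether this works under FreeNAS on this exact setup is untested here, and the device path and indices below are assumptions:

```shell
# smartctl can, in theory, query drives behind a cciss controller one by one;
# each physical drive gets its own index (the device name /dev/ciss0 is a guess):
for i in 0 1 2 3; do
  echo "would run: smartctl -a -d cciss,$i /dev/ciss0"
done
```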
My current NAS:
- is based on HP Proliant ML-350 G4p hardware (iLO capability, PCI-X buses, brilliant cooling – and noise)
- supports up to 8 disks (with 6 self-made steel caddies to replace the originally missing SCSI disks)
- runs FreeNAS-9.2.0-RELEASE-x64 on a single Intel(R) Xeon(TM) 3.00GHz CPU (I was not able to find a dual-processor model) and 8GB of ECC memory (later addition: upgraded to 12GB, memory modules obtained from eBay)
- currently uses a ZFS RAIDZ-1 setup with 5 WD 2TB Red disks
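As a quick check of what such a vdev yields: RAIDZ-1 keeps one disk’s worth of parity, so usable space is (n − 1) × disk size, before ZFS metadata overhead (the zpool command in the comment is only a sketch with invented device names):

```shell
# RAIDZ1 loses one disk to parity: usable space = (n - 1) * disk size
disks=5; size_tb=2
echo "usable: $(( (disks - 1) * size_tb )) TB raw, before ZFS overhead"
# Such a pool would be created roughly as (device names invented):
#   zpool create tank raidz1 da0 da1 da2 da3 da4
```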
My near-term plans:
- to sort all my files and erase duplicates. I expect to free some 1/3 of the space in my stored archives.
- meanwhile, to purchase 2 pieces of 4TB Toshiba drives (to support biodiversity 😉 )
- later this year, to build a backup server and set up automatic replication of ZFS snapshots.
- later this year, to upgrade my (managed!) home network from 100Mbit to 1Gbit.
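The snapshot replication mentioned above boils down to zfs send/receive. A minimal sketch, assuming a pool named tank and a backup host reachable as backup-nas (both names invented):

```shell
# Daily replication sketch: take a recursive snapshot, then stream it to the backup box.
# A real setup would also prune old snapshots and use incremental sends (zfs send -i).
snap="tank@$(date +%Y-%m-%d)"
echo "would run: zfs snapshot -r $snap"
echo "would run: zfs send -R $snap | ssh backup-nas zfs receive -Fu tank"
```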
My long term plans – within 2-3 years:
- build a new, more suitable NAS (with a more precise balance between power and speed)
- probably using 2.5″ disks and professional caddies.
Previously we talked about how to substitute the BBU batteries of the HP SmartArray P400. Today we show the same for the HP P600 RAID controller. This is how the battery (well, accumulator) looks:
There are several choices: purchasing a new battery from eBay, substituting the charge elements inside the battery, or our way – adding external accumulators. Naturally, doing this voids any real or imagined warranties from HP. Here you see what is inside the battery housing after you take it apart:
You should take the original elements out of the battery:
The most important task is to find suitable batteries for the substitution. We took a risk and purchased accumulators with twice the capacity of the original ones. Our source for accumulators is http://www.ristart.ee :
The next step is to prepare a PCB which will carry the golden contacts. The contacts are separated from the original accumulators by means of a sharp knife. The resulting PCB will look approximately like this:
The whole building block is then hardened with a glue gun:
The P600 controller actually needs two batteries, so the resulting buildup will look like this:
This is how the P600 looks after the modification:
It is necessary to find a suitable place for the batteries inside the computer case:
Use two-sided tape to fasten the batteries:
This is the resulting picture:
Last but not least – power on the computer and let the batteries charge. On the picture below, the batteries are still empty and thus the cache is not usable yet:
It would be wise to update your controller firmware. Unfortunately, I cannot give any hints here, because HP has recently decided to charge for access to upgrades.
So that’s what the bear had in its belly!