1TB Hard Disk Partitioning for Speed - what size and NTFS Mount points

DaveDave

New member
Hi
I'm just about ready to change my quote to an order (just hoping for some comments in the Check this Spec forum).

As well as a boot SSD I have a 1TB WD Caviar Black as the main work drive and want to partition it for "short stroke" speed.

Does anyone know (or have an HD Tune performance chart showing) where the read/write performance starts to drop off as you measure across the disk?

I'm thinking that if I use a 200GB partition I'll get good speed out of it, and then split the remaining space into two 400GB partitions.
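To show the kind of sweep I mean, here's a rough Python sketch that reads the raw disk at a few positions and prints the sequential speed at each point, a bit like HD Tune's transfer-rate graph. The \\.\PhysicalDrive1 path and the 1TB size are just my assumptions (check Disk Management for the right disk number so it doesn't hit the SSD), and it has to run as Administrator:

import time

DEVICE = r"\\.\PhysicalDrive1"   # assumed disk number for the 1TB data drive - check Disk Management
DISK_BYTES = 1000 * 1000**3      # nominal 1TB
CHUNK = 4 * 1024 * 1024          # 4MB per read; raw-device reads must be sector aligned
READS_PER_POINT = 16             # sample 64MB at each position

def speed_at(disk, offset_bytes):
    # Sequential read speed (MB/s) for a short burst starting at offset_bytes.
    disk.seek(offset_bytes)
    start = time.perf_counter()
    for _ in range(READS_PER_POINT):
        disk.read(CHUNK)
    elapsed = time.perf_counter() - start
    return (CHUNK * READS_PER_POINT) / (1024 * 1024) / elapsed

with open(DEVICE, "rb", buffering=0) as disk:
    for percent in range(0, 100, 10):
        offset = (DISK_BYTES * percent // 100) // CHUNK * CHUNK   # keep the offset aligned
        print(f"{percent:3d}% across the disk: {speed_at(disk, offset):6.1f} MB/s")

(The numbers are only rough - Windows may cache some reads on a second run - but the shape of the curve should show where the drop-off starts.)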

Any thoughts?

Also, does anyone have any experience with NTFS mount points? I'm hoping to mount the 2nd partition mentioned above in a folder on the first partition - that way I can access both partitions under a single drive letter but have "two-speed" data within it. Has anyone tried this, or know of any problems (backup, etc.)?
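For reference, this is the sort of thing I have in mind for the mount point - a minimal Python/ctypes sketch of the Win32 call that creates one (the same thing Disk Management does). The drive letters (D: for the fast partition, E: for the slower one) and the Archive folder name are just placeholders, and it needs to run as Administrator against an empty folder on an NTFS volume:

import ctypes, os

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)

MOUNT_FOLDER = "D:\\Archive\\"   # empty NTFS folder on the fast partition; path must end with a backslash
SOURCE = "E:\\"                  # the slower partition to graft into that folder (example letter)

os.makedirs(MOUNT_FOLDER, exist_ok=True)

# Look up the \\?\Volume{GUID}\ name behind E:
buf = ctypes.create_unicode_buffer(50)
if not kernel32.GetVolumeNameForVolumeMountPointW(SOURCE, buf, len(buf)):
    raise ctypes.WinError(ctypes.get_last_error())

# Graft that volume into the folder; afterwards E:\ and D:\Archive\ point at the same data.
if not kernel32.SetVolumeMountPointW(MOUNT_FOLDER, buf.value):
    raise ctypes.WinError(ctypes.get_last_error())

print(buf.value, "is now mounted at", MOUNT_FOLDER)

(The interactive route is Disk Management > Change Drive Letter and Paths > Add > "Mount in the following empty NTFS folder", and the E: letter can be removed afterwards if I only want the folder path. I'd still want to check how the backup software treats it, since it's two separate volumes underneath.)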


Cheers
 

ubuysa

The BSOD Doctor

If you're looking for performance rather than data organisation then don't partition at all. Get a copy of Ultimate Defrag from http://disktrix.com/; it's not free, but it's the best $29.95 you'll spend. It's an optimising defragger which lets you place your critical high-performance files close to the 'faster' (short-stroke) outer edge of the disk and close to the MFT, whilst placing your little-used 'archive' data on the slower inner tracks out of the way. The file selection capabilities are powerful and it's by far the best way of getting top performance from a big hard disk. I've used it on my HDDs for years and I highly recommend it.
 

DaveDave

New member
Thanks re Ultimate Defrag, I can see it might allow me the "per-folder" control I was thinking of. I'm assuming that its monitoring overhead is not significant and it is reliable!?
 

Wozza63

Biblical Poster
The Auslogics defragger has an optimizer built into it that will put your most-used files towards the centre of the disk. It takes a while to run, especially the first time. It's also completely free! Just make sure you don't use it on the SSD.
 

ubuysa

The BSOD Doctor
DaveDave said:
Thanks re Ultimate Defrag, I can see it might allow me the "per-folder" control I was thinking of. I'm assuming that its monitoring overhead is not significant and it is reliable!?

It's a one-off tool: you run it to reorganise your data and that's it, until the next time you need to defrag of course. It's not a real-time tool, so there is no overhead.
 

ubuysa

The BSOD Doctor
Wozza63 said:
The Auslogics defragger has an optimizer built into it that will put your most-used files towards the centre of the disk. It takes a while to run, especially the first time. It's also completely free! Just make sure you don't use it on the SSD.

That's not necessarily where you want them; it depends on whether you move the MFT and the directories to the centre as well. Basically you want the MFT, the directories, and all your high-performance files close together. Putting them on the early (outer) tracks allows for expansion into the disk without a major performance impact; putting them in the centre doesn't do that, and in fact it encourages long seeks when the files become fragmented. I wouldn't recommend that for best performance.
 

Wozza63

Biblical Poster
Yes, that's what I meant, sorry. By files I meant everything, not just user files, so whatever needs high performance most is put towards the center.

The program performs defragmentation at the same time, so files aren't fragmented and don't slow down loading times.
 

ubuysa

The BSOD Doctor
Wozza63 said:
Yes, that's what I meant, sorry. By files I meant everything, not just user files, so whatever needs high performance most is put towards the center.

The program performs defragmentation at the same time, so files aren't fragmented and don't slow down loading times.

You mean on the inner tracks (close to the spindle) when you say towards the centre?

On a hard disk the outer tracks are physically longer than the inner ones, so modern drives use zoned recording and put more sectors on the outer tracks to take advantage of that extra length. Although it varies from disk to disk, in general the faster tracks are the outer ones (more sectors pass under the head per revolution), so I have always placed my critical, high-performance data there. The inner tracks I use for the archive data that I rarely access but need to keep on the disk.
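As a back-of-the-envelope illustration of the difference, here's a small Python sketch. The sector counts are made up (though typical of the outer and inner zones on a ~1TB 3.5" drive), so treat the output as the shape of the effect rather than figures for any particular disk:

RPM = 7200
SECTOR_BYTES = 512
revs_per_second = RPM / 60.0     # 120 revolutions per second at 7200 RPM

# Assumed sectors per track for the outermost and innermost zones (illustrative only).
for zone, sectors_per_track in (("outer", 2400), ("inner", 1200)):
    bytes_per_rev = sectors_per_track * SECTOR_BYTES
    mb_per_s = bytes_per_rev * revs_per_second / (1024 * 1024)
    print(f"{zone} zone: ~{mb_per_s:.0f} MB/s sustained")

That works out at roughly 140 MB/s on the outer zone against about 70 MB/s on the inner zone - the sort of 2:1 spread an HD Tune chart for a drive like this typically shows.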
 