Hi guys,
I'm a long-time reader, but first-time poster. First, I want to THANK YOU for all of the computer help you have given me over the years. I'm basically the tech-support guy for my friends and family, and without your awesome solutions here, I wouldn't be as smart as I am today.
Anyway, this is something I have been trying to comprehend for a couple of years now and I still can't understand it, so I decided to finally make an account and ask on the forums. I'm pretty sure this question is the equivalent of "How much does it cost to build a house?" because there are so many options and circumstances involved, but perhaps someone can point me in the right direction, or water it down enough for me to understand.
Basically, what I want to figure out is a comparison of all the different interfaces and their speeds (USB 2, 3, FireWire, etc.) in megabytes per second, when connected to a HD or SSD. (When people say megabits per second, I still don't understand what that means or how to convert it. Why do we even say it that way? I'll save that for another post.) I know they all have theoretical speeds, but I think that's just like internet speeds, where you never really get close to the advertised number, right?
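For reference, here's a quick sketch (Python) of the conversion I'm talking about. I'm going off the standard rule that there are 8 bits in a byte, so a megabits-per-second figure divided by 8 gives megabytes per second:

```python
def mbits_to_mbytes(mbits_per_sec):
    """Convert a speed in megabits/s (Mb/s) to megabytes/s (MB/s).

    There are 8 bits in a byte, so divide the megabit figure by 8.
    """
    return mbits_per_sec / 8


# USB 2.0 is rated at 480 Mb/s, which works out to:
print(mbits_to_mbytes(480))  # 60.0 MB/s theoretical peak
```

So a "480 Mb/s" interface tops out at 60 MB/s on paper, before any real-world overhead.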
As soon as I try to figure it out, I remember that the speed is going to be different depending on whether you have a 5400 RPM or 7200 RPM hard drive, or even an SSD. And when I throw SSDs into the equation, it seems that among the hundreds of SSDs out there, there isn't a standard read/write speed, because they're all different.
For years I've heard that the bottleneck of a computer is the hard drive, so what exactly is that bottleneck? How fast can a HD actually max out at read/write speeds, especially across the various interfaces?
I've been trying to make a chart to keep it all straight.
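Here's a rough sketch of the chart so far (Python, just to print it out), using the published theoretical peak rates for each interface. I understand the real-world numbers will be lower because of protocol overhead and the drive itself:

```python
# Published theoretical peak rates, in megabits per second (Mb/s).
INTERFACE_MBITS = {
    "USB 2.0": 480,
    "FireWire 400": 400,
    "FireWire 800": 800,
    "SATA II": 3000,
    "USB 3.0": 5000,
    "SATA III": 6000,
}

# Print the chart sorted from slowest to fastest, converting
# Mb/s to MB/s by dividing by 8 (8 bits per byte).
for name, mbits in sorted(INTERFACE_MBITS.items(), key=lambda kv: kv[1]):
    print(f"{name:<14} {mbits:>5} Mb/s  ~ {mbits / 8:>6.1f} MB/s theoretical")
```

Is that the right way to think about it, before even getting to what the drive on the other end can actually do?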