Originally Posted By: Dignan
Do you have a specific tool for this purpose?

I used iperf.

One side runs a server:
Code:
[tman@ideal ~]$ iperf -s
------------------------------------------------------------
Server listening on TCP port 5001
TCP window size: 85.3 KByte (default)
------------------------------------------------------------


The other side runs the client:
Code:
[tman@storage ~]$ iperf -c ideal
------------------------------------------------------------
Client connecting to ideal, TCP port 5001
TCP window size: 16.0 KByte (default)
------------------------------------------------------------
[  3] local 192.168.1.20 port 44971 connected with 192.168.1.50 port 5001
[ ID] Interval       Transfer     Bandwidth
[  3]  0.0-10.0 sec  1.08 GBytes   928 Mbits/sec
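
Worth noting: the client defaulted to a 16.0 KByte TCP window above, which is fine on a LAN but can cap throughput on higher-latency paths, since the window has to cover bandwidth x round-trip time. If it matters, iperf can set the window explicitly with -w; the 256k here is just an example value:
Code:
[tman@storage ~]$ iperf -c ideal -w 256k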


The two machines are connected via GigE to the same switch. iperf doesn't touch the disk at all, so the result should be fairly close to the theoretical maximum for that link with those parameters, assuming nothing else is using the network.
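As a ballpark for what "theoretical maximum" means here, assuming a standard 1500 byte MTU and no TCP options, the framing overhead works out like this:
Code:
On the wire:  8 (preamble) + 14 (Ethernet) + 1500 (payload) + 4 (FCS) + 12 (gap) = 1538 bytes/frame
TCP payload:  1500 - 20 (IP) - 20 (TCP) = 1460 bytes/frame
Max goodput:  1460 / 1538 x 1000 Mbps = ~949 Mbps

So the 928 Mbits/sec above is within a few percent of line rate (TCP timestamps, if enabled, shave the ceiling to about 941 Mbps).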

One side is using conventional PCI with only a single link, so if I run a dual-ended test, which measures both directions at the same time, I'm limited to 430 Mbps + 638 Mbps = 1068 Mbps. iperf uses 1000 rather than 1024 when calculating bps, so that's actually about 1018 Mbps in binary units, which is just under the 1064 Mbps (133 MBps) that a 32-bit 33 MHz conventional PCI card can do.

Code:
[tman@storage ~]$ iperf -c ideal -d
------------------------------------------------------------
Server listening on TCP port 5001
TCP window size: 85.3 KByte (default)
------------------------------------------------------------
------------------------------------------------------------
Client connecting to ideal, TCP port 5001
TCP window size:  487 KByte (default)
------------------------------------------------------------
[  5] local 192.168.1.20 port 55051 connected with 192.168.1.50 port 5001
[  4] local 192.168.1.20 port 5001 connected with 192.168.1.50 port 47713
[ ID] Interval       Transfer     Bandwidth
[  5]  0.0-10.0 sec   513 MBytes   430 Mbits/sec
[  4]  0.0-10.0 sec   762 MBytes   638 Mbits/sec
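
The unit conversion spelled out (iperf reports decimal megabits, i.e. 10^6 bits):
Code:
iperf total:   430 + 638 = 1068 Mbits/sec (decimal)
binary units:  1068 x 1,000,000 / 1,048,576 = ~1018 Mbits/sec
PCI ceiling:   33 MHz x 4 bytes = 133 MBytes/sec, x 8 = 1064 Mbits/sec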


The other side is 4-lane PCI-e, but the only other PCI-e machines I've got all run ESX, so I'd need to run iperf on the service console.
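
If the simultaneous dual test is unfair to a bus-limited card, iperf's -r option runs the two directions one after the other rather than at the same time, so each direction gets the full bus to itself and might give a fairer per-direction number:
Code:
[tman@storage ~]$ iperf -c ideal -r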


Edited by tman (16/01/2011 23:47)
Edit Reason: 1000 > 1024 for iperf mbps