Acorn Arcade forums: News and features: 100bT Network Cards Tested

100bT Network Cards Tested

Posted by Richard Goodwin on 01:00, 20/6/2002
This is a report we've obtained about tests performed on the Simtec NET100 and Castle Net20 network cards.

Westborough Testing Labs

Tests performed : 17/5/2002
Title : Network card testing.
Report by : RL, VS and BD
Notes : Results re-verified 13/6/2002


All tests are to be performed using the same Risc PC.
The machine spec is

  • a standard issue 1 RPC 600
  • 20MB RAM
  • 2MB VRAM
  • RISC OS 4.02
  • Internet module 5.06
The servers used
Linux server:
  • Dual P3 733MHz
  • 1GB RAM
  • Linux 2.4.18
  • Kernel NFS
  • Samba SMB
  • RAID array (conservative local throughput 88MB/sec+)
  • Network card is Ether Express Pro100
Windows server:
  • 1.4GHz Athlon
  • 512MB RAM
  • Windows 2000 + all critical updates
  • Local disc throughput greater than 23MB/sec
  • Network card is 3Com 3C905C
Switches:
  • For 100Meg a FS108 100Meg switch was used
  • For 10Meg a 3Com Linkbuilder FMS 2 was used
Software used
  • NFSClient 1.18 (23rd Jul 1999) / 2.44b (ImageNFS)
  • LanMan98 1.20 (20th Dec 2001)
  • !FSSpeed by Garry Partis (Morley Electronics)
Test cards
  • Castle Net20, driver EtherY 0.50
  • Simtec NET100, driver EtherX 1.10


The tests were intended to be repeated three times to guard against single-test error.
Test results were to be rounded to the nearest whole 10K/S, as accuracy greater than this could not be guaranteed.
The machine was to be clean-rebooted between the protocol and speed test steps.
The two protocols were chosen to test real-world activity over both TCP/IP and UDP/IP.
NFS was to use the standard 8K UDP packet size; SMB was to use the standard TCP settings.
The servers were both chosen and tested to have the ability to completely fill a 100Megabit FDX network, hence their performance can largely be discounted as a factor in the tests.
The tests to be performed were:

  • Four filesizes: 1, 4, 16 and 64MB
  • Two protocols: SMB (TCP/IP) and NFS (UDP/IP)
  • Network speeds: 10Mbit HDX and 100Mbit FDX/HDX
  • Two CPUs in the RPC: 33MHz ARM610 and 233MHz StrongARM
  • Two server OSes: Linux and Windows (NFS could not be performed against Windows)
  • Two cards to test: the Castle Net20 and Simtec's NET100
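The averaging and rounding procedure described above can be sketched as follows (a minimal illustration; the helper name and sample figures are ours, not from the report):

```python
from statistics import mean

def summarise_runs(rates_kps):
    """Average the repeated runs and round to the nearest whole 10K/S,
    matching the report's stated accuracy limit."""
    return round(mean(rates_kps) / 10) * 10

# Hypothetical raw figures from three repeated runs, in K/S.
print(summarise_runs([1038, 1044, 1041]))  # -> 1040
```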


An initial "control" test was performed with a Linux client workstation using the 100Meg FDX switch against both server OSes. This consistently achieved over 10000K/S, very near the maximum possible on the 100Meg network, confirming that our servers were operating correctly.
The RPC 600 was configured with the Castle Net20 podule, the StrongARM CPU, 100Mbit FDX network and the SMB transport against the Linux server.
Initially, extreme difficulty was encountered in getting the Net20 to start the interface in 100Mbit full duplex. Once this was overcome, the tests could begin.
The first test was run and a problem was immediately encountered: the transmit speed was an acceptable 1040K/S, but the receive speed was only 170K/S. This seemed constant over the four test sizes.
We proceeded to our second protocol test on NFS and were shocked to discover the test would simply not complete.
We decided to continue the tests at 10Megabit. The network was reconfigured to 10Mbit HDX and tested with the Linux server, which achieved 890K/S, reasonably close to the limit on an unswitched 10Mbit network.
The Castle card performed adequately, turning in a consistent 660K/S write and 880K/S read. This compares to an I-Cubed NIC, which returns 590K/S write and 650K/S read in the same test, or a RiscStation, which returns 670K/S write and 780K/S read.
The formal test protocol was halted at this point in an attempt to find the error (we were unwilling to accept these results).
We decided to switch to the Simtec NET100 to see if the 100Meg performance issues were common to both.
The RPC 600 was configured with the Simtec NET100 NIC, the StrongARM CPU, 100Mbit FDX network and the SMB transport against the Linux Server.
No issues with selecting 100Meg FDX were encountered. The card auto detected the link correctly and immediately selected 100Meg FDX operation.
The first four filesizes were tested and resulted in 950K/S write and 1400K/S read, which immediately ruled out the test configuration as the cause of the read performance problem.
The NFS tests were performed and resulted in 900K/S write and 1300K/S read, slightly lower than the SMB performance but still acceptable - at least the tests completed!
We then proceeded to the 10Meg tests, which yielded results of 630K/S write and 850K/S read.
The NFS tests were again a little lower, at 600K/S write and 800K/S read.
We performed the SMB tests on both cards against the Windows 2000 server and received very similar results to the Linux tests from both cards.
The ARM610 tests were omitted due to time constraints; however, some informal tests indicate that performance drops by around a third at 100Meg operation.
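To put the informal "around a third" figure in context, the StrongARM SMB numbers from above can be scaled down as a rough estimate (the scaled values are illustrative only, not measurements from the report):

```python
# Measured StrongARM figures from the 100Meg SMB tests (K/S).
strongarm = {"smb_write": 950, "smb_read": 1400}

# Assumption: ARM610 throughput is roughly two-thirds of StrongARM's
# at 100Meg, per the report's informal "drops by around a third".
arm610_estimate = {k: round(v * 2 / 3) for k, v in strongarm.items()}
print(arm610_estimate)  # illustrative estimates, not measured results
```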

Package Quality

The hardware build quality of the Castle card is to an adequate standard, although the lack of diagnostic LEDs on the card's rear seems an unnecessary cost saving, and the general quality of the metalwork is low.
The hardware build quality on the Simtec card is to a very high standard with an innovative low chip count design. The diagnostic LEDs are helpful in card configuration and the rear metalwork is of a very high quality finish.
Bundled software with the Castle card consists of the device driver and a copy of !Boot to merge, to allow detection of the card by the RISC OS configuration utility, but little else was provided. It is noted that the auto-detect code fails if there is no Castle card present.
Bundled software with the Simtec card includes a bootp client and a user-friendly firmware upgrade tool, which appears to allow for half a megabyte of modules to be placed in the card's EEPROM. The software bundle on this card has a generally more professional feel.


This comparison cannot be considered complete due to the failure of the Castle card to complete the tests, and its grossly anomalous results at 100Megabit operation compared to the Simtec card on test.
We can only assume that there is an issue with the Castle card's driver software; as the card firmware is changeable, this should, in our view, be corrected by Castle at the soonest opportunity.
The Simtec card completed all the tests and performed to an adequate standard. It is our belief that the card is limited either by CPU availability or by a scalability issue in the networking within RISC OS. Some informal tests using alternative OSes with the card yield results which reinforce this opinion.
In general conclusion, for 10Megabit operation either card is adequate, the Castle card performing very slightly faster, although the ease of setup of the Simtec card outweighs this benefit. For 100Megabit operation we can currently only recommend the Simtec card until the driver issues with the Castle card are corrected. Given its favourable software bundle, the Simtec card seems to offer an all-round better solution.

