 
   
Embedded Systems Engineering
These are my own personal views and not those of my company, Phaedrus Systems. The full version of this column resides at www.phaedsys.com under the Technical Papers button.
          
Having had a break from the column (life has been busy), I thought I would cover something controversial. But first, my annual suggestion that everyone take a break from the pressures of work; see the Summer 2005 and Summer 2004 columns on the web site about that.
        
However, it bears repeating that the mind, as much as the body, needs time to recharge. The headmaster at my grammar school always used to end his summer holiday address to the school with “…read a good book”. The other adage, “a change is as good as a rest”, reminds me that a change of environment is good too. So a day at ESS in October at the NEC Birmingham will also do you the world of good. Bookings are well up for this point in the schedule and the show looks like being bigger than last year. Well worth a day out, just to get ideas.
        
I did some seminars earlier this year (with Arrow, on ARM) and it was surprising how many of the engineers, some quite experienced, did not know they did not know (ref 1) some of the options in the way of MCUs, tools and software such as RTOSes, other software components and methods. So a trip to ESS will be a day well spent, as you don’t know what you don’t know. I will have a stand there and of course I will have decent coffee again; bring your own doughnuts, come and have a chat, but please visit all the other stands as well. I got into trouble last year as one or two people thought my requests to come to the show and stop at my stand were giving me an unfair commercial advantage… I did not think that people would come to the show just to see my stand, but would also wander round all of the stands. It is worth doing that anyway, as you never know who or what might be there.
        
Recently I have managed to get into a bit of an argument over the GCC compilers. The aficionados of GCC tell me that it is better for several reasons, though performance isn’t one of them. One is that, as you have or can get the source “of any version”, you can re-build any old version of the compiler. Whilst this is true, unless it is re-compiled with an identical binary of the original compiler used to build it, the result is not going to be the same. The new “old version” will be a different beast and will behave differently. If this were not the case, people would not need older versions of compilers for maintenance on legacy projects in the first place.
        
The chances of having an already-built copy of the original compiler used to compile the old version of GCC are quite low. So it will be almost impossible to recreate an exact copy of an old version of a GCC compiler. The argument used was that you couldn’t get older versions of commercial compilers. This is not true. At least it is not true for all the commercial compilers I know. More to the point, the commercial companies have all their software under version control and keep binaries of every version they produced, so they can issue an exact version. Also, the issue of dongles and the like is normally not a problem on the old versions of compilers, despite the claims of the open-source camp.
        
The next argument was that with open source you could fix bugs… This is extremely unlikely, for several reasons. Firstly, compilers are complex things; though I am informed from several sources that the GCC core is only about five years behind the average mainstream commercial compiler, it is still a complex beast. The chances of a person not involved in the development fixing a bug without introducing others or causing other side effects are quite low. The more important point is that, after many years of answering support calls for compilers, I have found that a large number of the “compiler bugs” I have had to deal with were in fact not bugs at all, just a misunderstanding of correct C behaviour. Also, given the arguments on the C language reflectors, it is unlikely that anyone not in the C standards loop is going to have the correct information to fix compilers. Yes, some of the GCC core maintainers are on the C standards panels. Though it has to be said there are several companies that maintain their own versions of the core.
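To give a flavour of what I mean, here is a small sketch of my own (not taken from any actual support call) of the kind of code that arrives as a “compiler bug” report but is in fact correct, if surprising, C behaviour:

#include <stdio.h>

int main(void)
{
    /* Whether plain char is signed or unsigned is implementation-defined.
       Where char is signed, c holds -1 and the test fails; neither
       outcome is a compiler bug. */
    char c = 0xFF;
    if (c == 0xFF)
        printf("plain char is unsigned here\n");
    else
        printf("plain char is signed here (c == %d)\n", c);

    /* Integer promotion: both operands are promoted to int before the
       addition, so the result is 300, not 44. Programmers expecting
       unsigned char arithmetic to wrap modulo 256 often report this
       as a bug; it is correct ISO C. */
    unsigned char a = 200, b = 100;
    printf("a + b == %d\n", a + b);

    return 0;
}

Both of those generate real support calls, and in both cases the fix belongs in the application code, not the compiler.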
        
Finally, if you fix “bugs” or rebuild an old version from source, you will need to fully retest the compiler you have built. Commercial compilers are usually rigorously tested against industry-standard test suites such as Perennial and Plum Hall, against other systems such as Paranoia that test specific areas of the libraries, and against rigorous in-house test suites. As far as I know GCC is not tested against these industry standards, as the commercial test suites cost a lot of money.
        
You could use the GCC test suite, but this only really confirms that the compiler has built correctly and conforms to GCC C, not to ISO C. You also need the correct version of the test suite for the compiler, and you must be able to show that the test suite used is a valid copy and not an edited one. The other point is that even if you did test a version of GCC, the results would only apply to that particular binary, or copies of that binary, not to the source. As soon as you make any changes to the source you need to retest the whole compiler.
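As a small, hedged illustration of the difference between “GCC C” and ISO C (my own hypothetical example, not taken from the GCC test suite), the macro below uses a GNU statement-expression extension: gcc accepts it with its default settings, but rejects it once you ask for strict conformance with -std=c99 -pedantic-errors.

#include <stdio.h>

/* ({ ... }) is a GNU C statement expression, not ISO C. A default gcc
   invocation accepts it; adding -std=c99 -pedantic-errors rejects it,
   because ISO C forbids braced groups within expressions. */
#define MAX(a, b) ({ int a_ = (a); int b_ = (b); a_ > b_ ? a_ : b_; })

int main(void)
{
    printf("%d\n", MAX(2, 3));   /* prints 3 */
    return 0;
}

Code like this passes through the “GCC C” net quite happily, which is exactly why a clean run of the GCC test suite tells you little about ISO conformance.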
        
Part of the problem here is one of liability. If you use a compiler you have modified or built yourself, you are going to have more of a problem showing due diligence than if you had used a built and tested commercial compiler. This is because you will have to show that the compiler you have built is fit for purpose.
        
Interestingly, there is a GCC that is available for safety-critical work… It is supplied as a binary, and it is maintained and tested by the company that supplies it as part of a package of other software. It is not inexpensive either. If you use any GCC bar the one they supply, then all the guarantees are invalidated. Actually they are not alone. Several companies supply GCC packages, though not for safety-critical work, at a cost. Technically the cost is for the IDE, the support tools, the installation system or the support. In effect GCC is becoming commercial and should be judged against the other commercial tools as such. Well, I have dug myself a hole and will probably get buried by the disciples of GCC.
        
In a similar vein, I saw an article in Computing (August 10th issue) about cryptographic software. The author was complaining that consumers of crypto software had no real idea which crypto implementations were good and which were bad. The trouble was that most of the poor-quality systems were being sold at a much lower price than the high-quality systems. The problem, as the author saw it, was that more people were using low-priced, low-quality crypto software, which at the same time shrank the market for the more expensive but higher-quality software, through either lower sales or drastically reduced prices. The net result is the spread of lower-quality crypto and the decline of the high-quality vendors in the market.
        
The IT industry is the only one I have come across where its practitioners delight in using cheap, low-quality tools. I find it very strange. There are parallels with the building trade, where carpenters, electricians and so on buy the best tools they can, while the cowboys and DIYers buy the cheaper tools. If we all followed the same trend in our private lives we would all be driving Trabants.
        
On the standards front, I hope to have some MISRA-C news shortly regarding the Technical Corrigendum and the example suite. The MISRA C++ team are well into their work, as are the MISRA-Autocode team. However, the MISRA-UML team are still working out their terms of reference. If anyone is interested in working on MISRA-UML, please email me.
        
I can report that, due to a minor spat between two members of the BSI C panel on the panel email reflector, one got suspended, which led to more exchanges, which led to the convenor suspending the whole panel and then resigning after eight months in the job... So there is currently no BSI C panel. Hey ho! I put it down to the weather.
        
Finally, according to The Register, the humble PC was 25 years old last month. Can you remember where you were when IBM launched it in the summer of 1981? The PC had a character-based monochrome screen, no colour let alone graphics, up to 640K of RAM, and ran at 4.77MHz! Incidentally it also had a cassette tape interface and no hard drive, as I recall. Apparently PC World has one on display in their branch at Staples Corner in London. Whilst almost every part of the PC has changed, most are still beige. The thing that made the PC so popular was the fact that IBM forgot to patent it. This meant that others could produce them too. IBM did patent MCA, OS/2 and the PS/2, which did not go far. Come to that, neither did any non-standard PCs. The answer is standard interfaces, and if you must do a new one you need an industry group to back it to give it critical mass.
(1) There are known knowns, there are known unknowns and there are unknown unknowns.
          
That is, there are things we know that we know. There are things we know that we don’t know. There are also things we don’t know that we don’t know.
        
        However, I would also add that there are things we know   but we don’t know that we know them…
Eur Ing Chris Hills BSc CEng MIET MBCS MIEEE FRGS FRSA is a Technical Specialist and can be reached at This Contact
Copyright Chris A Hills  2003 -2008 
      The right of Chris A Hills   to be identified as the author of this work has been asserted by him in   accordance with the Copyright, Designs and Patents Act 1988