An interloper in IT

September 2015

A single black sheep among white sheep

Mike O'Neill MBCS, over the many years he’s been involved with IT, has come to believe that any successful college of IT needs to be generously seeded with people whose background takes in life and business skills acquired away from the analysis- and algorithm-led world inhabited by the IT professional. Here he provides a potted history of his own experiences working in business and explains how IT came to his rescue on numerous occasions.

When I was about 22 years old and wondering which way my working life should go, I took a career profiling assessment at what would now be called JobCentre Plus. At the end of it, the assessor made a couple of observations. The first was that my score in arithmetic skills was educationally subnormal. The second was that I appeared to want to do everything I didn’t have an aptitude for.

In 1982, I was given an Osborne 1, the first portable computer to be mass produced and designed to bring the power and productivity of modern IT into the lives of busy travelling executives. The Osborne 1 (designed by Adam Osborne) was arguably the progenitor of the modern laptop and was briefly very successful.

It was designed to be carried as hand luggage by executives (or at least those executives who could contemplate a non-rectangular piece of hand luggage weighing 23.5lbs), and was sized to fit neatly in the foot space underneath the passenger seat of a commercial aircraft. When opened, the latched lid revealed a QWERTY keyboard with a tray for pens and pencils.

The main case featured a tiny (5" diagonal) cathode-ray tube screen with 24 rows and 52 columns of characters physically available to view, though using the arrow keys gave you a virtual screen 80 columns wide. This Osborne had two 5 ¼ inch double-density floppy disk drives, each of 185kB capacity, and 64kB of RAM. It used the CP/M operating system. So far, so ordinary.

The Osborne’s unique selling point was that it came bundled with a complete suite of software including MBasic and CBasic; WordStar (the leading word processor at the time and with mailmerge capability); and Supercalc (a good spreadsheet). As if these weren’t enough, the bundle also included Ashton-Tate’s dBaseII, a genuinely relational database with a relatively easy-to-use programming language and runtime code interpreter.

In effect then, the business person had something functionally similar to ‘Office Professional’ at a time when personal computing was in its infancy, and portable personal computing was virtually a neonate. And all this cost around £1,000 to buy.

On learning to type, and moving North

I set to work learning how to use the Osborne and its software. First, I learned to type. The word processor made this a much more worthwhile exercise than using a typewriter as no effort need be wasted: I could both correct and re-organise my writing (and my thinking) as I typed. My organisation was starting to reduce headcount in the recession of the early 1980s, starting with secretarial resources, so the manager who could type their own correspondence was at an increasing advantage.

I bought a Sharp Silver Reed daisy wheel electric typewriter (with special interface) for around £200, and I could now print to full typed quality anything I had produced using the word processor. And by photocopying a standard letter onto headed paper and using the Sharp to print the address into the window envelope position, I could even handle mailmerges provided the volume of letters was modest. I could also do budget forecasting with a Supercalc spreadsheet, build databases in simple relational form using dBaseII, and mine them for relevant data. 

In 1983 my job (in a construction research association) saw me take up post as regional information officer for the North West and West Midlands. As another exercise in cost-cutting, what had previously been covered by two people as separate regions now became one merged region.

The prospect of managing all the activity in a region as large and busy as this was daunting, especially using old fashioned paper-based filing systems. But with the power of the Osborne and its business applications, tasks that once took days now took minutes, and limited resources could be applied with a precision never before possible.

I quickly set up systems to track and flag up the key activities in my region. I built lists of offices and sites and the people associated with them, all suitably coded and indexed for business sector, geographical location, seniority of staff, and with other filterable interest codes.

The power of dBaseII extended to writing routines that allowed validation of input data so that no key data got mistyped and lost in the system. And using Supercalc, I produced budgets and other financial forecasts as well as other scheduling documents best suited to a sortable tabular format. I was flying at last.
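The original dBaseII routines are long gone, but the idea translates directly to any language. A minimal sketch in modern Python (the field names and code lists here are invented for illustration, not taken from the original system) of the kind of validation that rejected a contact record before a mistyped code could make it unfindable:

```python
# A sketch of input validation in the spirit of those dBaseII routines:
# check every coded field against its list of valid values before the
# record is committed. The sector and region codes below are invented.

VALID_SECTORS = {"CIV", "BLD", "MECH"}   # hypothetical business-sector codes
VALID_REGIONS = {"NW", "WM"}             # North West, West Midlands

def validate_contact(record):
    """Return a list of problems; an empty list means the record is clean."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is required")
    if record.get("sector") not in VALID_SECTORS:
        errors.append("unknown sector code: %r" % record.get("sector"))
    if record.get("region") not in VALID_REGIONS:
        errors.append("unknown region code: %r" % record.get("region"))
    return errors

clean = {"name": "J. Smith", "sector": "CIV", "region": "NW"}
assert validate_contact(clean) == []            # clean record passes
bad = {"name": "", "sector": "XXX", "region": "NW"}
assert len(validate_contact(bad)) == 2          # two problems flagged
```

The point, then as now, is that a coded field rejected at the door costs seconds, while one silently accepted can lose a contact forever.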

Finally, in 1984, our management board approved the purchase of a dedicated word processor and daisy-wheel printer, at a cost (from memory) of around £7,000. They based the machine in the London office and invited managers from regional offices to post handwritten letters and address lists for inputting and mailmerging in London, with the resultant letters being posted back to them for signature.

Turnaround time was one to two weeks. But the Osborne was meeting all those needs and much more, to a similar standard, without the need for any central support, and at a fraction of the cost. Its turnaround time was a few hours.

New challenges back down South, and the PC is born

In 1987 the cumulative effects of recession in the industry forced us into further, major cost-cutting. All regional offices closed, and I was asked to return to the London Area and jointly to set up a small team tasked with creating a national training provider of short courses to replace the large residential training centre which was being closed. No-one thought we could do it.

The old training centre had never been proactive with marketing or customer relations: it had simply been part of an old construction industry that was now disintegrating. So we were faced with multiple challenges and a very short timescale in which to address them. Our small team of three would have to fish for and capture details of people, organisations and their interests.

We would have to be able to do complex financial forecasting. And we would have to handle a complex matrix of event schedules, finding and booking venues, booking trainers, scheduling course material production and so forth. And we’d need to do all our own administration. The Osborne rose to the task admirably, and with the arrival of the Hewlett-Packard LaserJet II printer, it could generate paperwork at a decent speed. From basic principles, we established simple customer relationship management, and the effect was astonishing.

Requests to go on the mailing list poured in, and well targeted personalised letters poured out, along with the brochures promised in the personalised letter. And Supercalc meant we could do ‘what if’ profit and loss forecasting effortlessly.

By now, the first ‘286’ PCs had started to arrive on the estate. These featured the MS-DOS operating system and were a generation on from CP/M machines like the Osborne (even though by now I had had mine souped up with an EPROM - an erasable programmable read-only memory chip - so that my software could be run from a solid state drive).

Ashton-Tate had brought out dBase III, which allowed up to 15 relational work areas (aliases) as opposed to the two areas / aliases allowed in dBaseII. But dBase III was still interpreter-based software and was a bit slow and clunky. A new kid on the block - Foxbase+ - ran much faster, but offered identical functionality using the same xBase language. I opted for Foxbase+ and migrated my data and applications onto an MS-DOS PC, using a conversion utility on the Osborne to create PC-readable disks on which to put my work and transfer it.

Foxbase+ was fast and things really started to take off, but there remained two problems. Firstly, Foxbase+ required that each person have a copy of the software on their PC, and our tiny budget did not run to buying multiple copies. Secondly, my new job meant I was travelling a lot in order to set up and run events, but with much wasted time on site once the event was in progress.

A portable computer allowed me to use this time productively. However, although the first MS-DOS laptops were starting to appear on the market, their price tag was also outside our budget. We were a small ancillary business after all, and although the larger employer might opt to roll-out desktop PCs, we were way down in the queue when it came to prestige items like laptops.

The mentality that had characterised who had what kind of company car had migrated to the world of IT accessories, overshadowing assessment based on business need. So it was that the Osborne became my laptop, and I used my own money to buy Foxbase+ and a couple of good books from which to steal code for needed new functions and procedures, typing it in as ASCII text whilst away on the road. When I returned to the office, I would transfer my work to the PC for debugging and commissioning.

A Clipper arrives from Nantucket

In 1988 I discovered Nantucket’s Clipper xBase compiler. This product (which had been around since 1985 and was by now becoming an established favourite) was the dark secret at the core of the xBase world. It allowed 99 work areas / aliases, compiled dBase code to something akin to the C language, provided interfaces to functions written in C and Pascal and generated stand-alone .EXE files that could be distributed royalty-free.

Applications ran blisteringly fast, the more so since Moore’s law ruled. And, eight years after buying it, I could still type away on the trusty old Osborne in order to design and build the rudiments of my applications. The Osborne would throw a wobbly from time to time, as the disk reading heads on its floppy disk drives were prone to moving out of alignment, especially if it was subjected to too many bumpy rides in my car. But I had a trusty repair man in Warwick, who would always get it working again.

Clipper was a revelation, and was taken up and used for a lot of mainstream business applications. It was a language that was loved by all who used it, and the world of Clipper was fertile with exciting third-party products that enhanced its power and potential. Fast linkers like Blinker allowed the creation of dynamic link library files. DLLs were a new thing to me: using them I could design applications so that parts could be swapped in and out of RAM as required.

This meant there was effectively no limit to how large an EXE file could be, which in turn meant I could now pull together several applications into one central super-application. Advanced index creation tools like Comix meant that multiple indexes on each database table (.DBF file) could be held in one .CDX file, whereas the original Clipper/xBase architecture demanded a separate .NTX file for each index. And since more files meant more memory-consuming DOS file handles, fewer index files was a welcome benefit. Memo field editor products brought word-processor functionality to the editing of memo field data.

First class programmer’s editors such as Multi-edit were configured to be fully useable with Clipper and xbase generally. And excellent report writing tools such as R&R Report Writer allowed sophisticated data presentation through sorting, grouping and high quality multi-font printing.

Clipper applications also allowed the running of other DOS-based products from within the Clipper application. So WordPerfect and Lotus 1-2-3 could be called and run from within the application, partially sidestepping the single-application limit imposed by MS-DOS. In fact, using Clipper’s keyboard wait-state polling functions, a Clipper application could allow all sorts of pop-up applications, reminders and so forth that would be more traditionally associated with the Windows world of today.

Necessity is the motherboard of invention

In 1993, after several years of good earnings but questionable investment strategy, the effects of a new recession in civil engineering started to take their toll on our customers, and we faced a serious loss of business. We had considered paying a mainframe applications house £80,000 to replace the obsolete CRABS (Course registration and booking system) on the organisation’s HP 3000.

I had been suggesting for some time that I could write a PC-based application at zero cost, but the idea had been rejected. Now, with a forecast loss that meant we faced closure and redundancy, I suggested again that I write an application to allow us to operate a multi-user system based on the PC-based Novell network the organisation was setting up to replace the HP 3000. This time I was given the go ahead. I migrated the organisation’s key databases from the mainframe computer and placed them on a 286 PC with a 20MB hard drive.

I wrote an application to mimic completely, on screen and in output, the front end of the mainframe’s course booking system, but all driven by Clipper code, kick-started using FoxBase’s screen and code generator tool. I called it CRAMS (course registration and marketing system) as a poke in the eye for the old system’s marketing shortcomings, and because I wasn’t keen on our competitors spreading rumours that ‘we had CRABS2’ or some such.

The result of all this was that my older and non-IT colleagues could use CRAMS from Day 1 because it mimicked completely the old CRABS they were familiar with. Using the two printer ports on the PC, letter quality documents would be directed to a LaserJet printer, and the invoice batch report to a dot matrix printer.

Working with a software consultant from Sage accounting software, I wrote an export file routine that would allow our pro forma invoice data to be pulled into the back of the organisation’s main accounting system.

At this point my application was still single-user, and we moved to our new office with only one PC, on which a single person might do all the work involved in booking and invoicing delegates. But whilst the organisation was setting up and debugging its new Novell network, I created the record and file locking and other semaphore functions that would be needed for my application to move into a multi-user environment.
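The original semaphore functions were Clipper code working against shared .DBF files on the Novell server, but the discipline they enforced is easy to sketch in a modern language. The version below is a hedged Python analogue (the file names are invented): try to take a lock for a bounded period, and give up gracefully rather than risk two users updating the same record.

```python
# A sketch of the locking discipline described above, using a lock file
# as the semaphore. os.O_CREAT | os.O_EXCL makes creation atomic: it
# fails if the lock already exists, which is the "test and set" that a
# shared-file system needs. File names here are invented for illustration.
import os
import time

def acquire_lock(lock_path, timeout=5.0, poll=0.1):
    """Attempt to create a lock file atomically; retry until timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return True
        except FileExistsError:
            time.sleep(poll)  # another user holds the record; wait and retry
    return False              # timed out: tell the user, do not update

def release_lock(lock_path):
    os.remove(lock_path)

if acquire_lock("crams_record_42.lock", timeout=1.0):
    try:
        pass  # ... update the shared record here ...
    finally:
        release_lock("crams_record_42.lock")
```

Clipper's own RLOCK()/FLOCK() functions did this at the record and file level natively; the sketch simply shows the retry-then-fail-politely shape that made multi-user use safe.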

When the Novell system was ready, so were we and the new system went into use with minimal fuss. An essential of its success was that the five of us in the training provider business were all stakeholders: it was in our interest to make the application successful.

We all functioned as beta testers and product developers. When the application fell over, I identified the problem and fixed it. When it didn’t do something that would be useful, I would put that on a list of jobs to be done when our business was naturally quiet, usually over the summer break.

Before long, we were doing some quite clever things. Using a product called Faxnet, I was able to write functions that allowed us to scan maps, course brochures and other documents, link them to relevant records, and instantly fax them to delegates and enquirers. This saved a huge amount of time over the established way of doing these jobs, and it encouraged the capture of names and other details from those who had contacted us.

We were also able to do fax broadcasts, selecting recipients using sophisticated data profiling, and with a ‘fax preference’ field indicating where people had confirmed they were happy to receive information by fax. One example illustrates just how powerful this tool could be. We had won the contract to organise the Sustrans conference, held to announce the launch of the proposed national cycle network.

Nearly 400 delegates were booked, with the Minister of Transport, the Rt Hon Steven Norris MP due to give the opening address. At midday on the day before the event, we learned we would have to move the event start forward 30 minutes in order to match the minister’s diary.

We faxed the news to all delegates (for all of whom we had captured a fax number as part of their event booking). On the day, all but three had received the fax and amended their plans to suit. This was in the days before reliable corporate email, so it was our only option.

The internet arrives in the business world

It was now 1995. Delegate questionnaires, course folder document inventories, electronic document management and storage, invoicing and purchase order management, profit and cost analysis and so forth were all being done within a single application. The business was surviving (just) in its new leaner form, and interest in the internet as a medium for publishing and publicity was just starting to gain momentum.

I took a quick look at how to compose web pages in HTML and was struck by how hyperlinks (the <a href="..."> anchor element) lent themselves to the use of parent record IDs (the go-from in HTML) and child record IDs (the go-to in HTML) if I wrote a function to write descriptive pages out of our relational database application.

In other words, it should be very quick to write some functions that allowed quick publishing (and re-publishing, but more of that later) of a nest of related HTML files, containing any information I cared to publish. And to avoid any case sensitivity issues, should the files end up in a Unix environment, I would name files using numeric unique identifiers (for example, 123.HTM).

Within a couple of days, I had enabled HTML web file publishing out of the application. Nowadays, we appreciate fully the global power of the internet in promoting a brand or an idea, but in 1995 the idea was still quite novel.
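The original publishing functions were written in Clipper, but the idea is compact enough to sketch in a few lines of modern Python. Everything below (the course and event data, the page layout) is invented for illustration; what it preserves is the scheme itself: parent pages link to children by numeric ID, children link back to their parent, and every file is named by its record number so case sensitivity on a Unix server can never bite.

```python
# A sketch of database-driven HTML publishing: one parent page per
# course, one child page per event, all filenames numeric (123.htm).
# The sample data is invented for illustration.
import os

courses = {101: {"title": "Concrete Repair", "events": [201, 202]}}
events = {
    201: {"date": "1995-06-12", "venue": "Manchester"},
    202: {"date": "1995-09-04", "venue": "London"},
}

def publish(out_dir):
    os.makedirs(out_dir, exist_ok=True)
    for cid, course in courses.items():
        # Parent page: link to each child record by its numeric filename.
        links = ""
        for eid in course["events"]:
            date = events[eid]["date"]
            links += '<li><a href="%d.htm">%s</a></li>' % (eid, date)
        with open(os.path.join(out_dir, "%d.htm" % cid), "w") as f:
            f.write("<html><body><h1>%s</h1><ul>%s</ul></body></html>"
                    % (course["title"], links))
        # Child pages: link back to the parent record's page.
        for eid in course["events"]:
            ev = events[eid]
            with open(os.path.join(out_dir, "%d.htm" % eid), "w") as f:
                f.write('<html><body><p>%s at %s</p>'
                        '<a href="%d.htm">Back to course</a></body></html>'
                        % (ev["date"], ev["venue"], cid))

publish("site")  # re-running simply re-publishes: no production cost
```

Because the pages are generated entirely from the database, re-publishing after a price or content change is a single function call, which is precisely the advantage over printed brochures described later in the article.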

In 1997 I left the construction industry, after 18 substantially happy years. Our once proud organisation of 500 people had now dwindled in number to barely 30, and our business (in which I had no share other than that of employee) was being sold to a large publishing house. I was head-hunted by a large and expanding corporate law firm.

One of the partners there had started a training provider as an ancillary business and client acquisition device, and was seeking to expand the brand from fledgling to provider of choice. My skills in this area had become transferable - though producing director-level training programmes on business and law would be a culture shift from courses on concrete for clerks of works.

At interview, I was asked by one of the partners how many staff I would need to build a turnover of £1m per annum. I replied that it would depend entirely on what tools those staff had for processing business.

My prospective employer was at that time using word processing software for invoicing and all other business needs. It was an incredibly cumbersome way to work. With the blessing of my old employer, I took my dBase/Clipper application with me and put it into harness in my new job.

By now it was an MS-DOS application operating out of a DOS box in a Windows environment. But it was resource self-sufficient, still worked very effectively, and was an instant success. And I was far too busy as a business manager (and family man) to invest in learning a new programming language and build a new application.

From LAN to WAN - dealing with poor connectivity

My new business team was split between Manchester and London, and this provided an interesting challenge. CRAMS was designed for use on a local area network, but was now operating over a wide area network of variable performance. I solved this by installing the application and datasets on both servers and then writing functions that understood at what mapped locations both copies of the application lived.

If any change was made to one of the data sets, the application at that site would attempt to open the relevant tables and lock and update the records on remote copies of the application, before updating its local copy. It would re-try this for a specified period, asking the user to wait.

If the attempt timed out, it would advise the user and put itself into ‘read-only’ mode so that reports, labels etc could still be got from the system, but new data could not be added. This simple approach worked remarkably well, since interrupted connections on the WAN were usually short-lived, even if they were annoyingly common.

It allowed much better performance than trying to have just one copy of application and files, and to try and run the application from a remote location. And it meant any outputs required at short notice (delegate badges and lists, for example) could always be got.
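The shape of that remote-first update logic is worth sketching. The Python below is a hedged modern analogue, not the original Clipper (the class and function names are invented, and a boolean flag stands in for a real network connection): try the remote copy first, retry for a bounded period, commit locally only on success, and drop into read-only mode on timeout.

```python
# A sketch of the dual-copy WAN update discipline described above.
# All names are invented; a `reachable` flag simulates the WAN link.
import time

class SiteDatabase:
    """Stand-in for one site's copy of the data; a real system would
    open and lock shared tables over the network here."""
    def __init__(self, reachable=True):
        self.records = {}
        self.reachable = reachable

    def try_update(self, key, value):
        if not self.reachable:
            return False          # simulates a dropped WAN connection
        self.records[key] = value
        return True

def replicated_update(local, remote, key, value, timeout=2.0, poll=0.2):
    """Update the remote copy first, then the local one.
    Returns 'ok' on success, 'read-only' if the WAN attempt times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if remote.try_update(key, value):
            local.try_update(key, value)  # remote succeeded: commit locally
            return "ok"
        time.sleep(poll)                  # ask the user to wait, then retry
    return "read-only"                    # reports only, no new data

local, remote = SiteDatabase(), SiteDatabase()
assert replicated_update(local, remote, "booking/42", "J. Smith") == "ok"
```

Updating the remote copy before the local one is the key design choice: a failure leaves both copies unchanged and consistent, rather than leaving a local edit the other site never sees.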

IF internet THEN E-commerce ENDIF

By 1997, the business-to-business corporate world was starting to look at the internet and to want a presence on it. Our law firm was no exception, and using my application I was quickly able to build - and, more importantly, maintain - a lot of dynamic changing information on a website that we had created specifically for the training business and linked to the corporate home page.

My co-director, a businessman of long experience, expressed concern at publishing our intentions about products and pricing. Our competitors (publishing houses who were Goliaths in size compared to our David) would see what we were up to and produce conferences ahead of us and at lower cost.

I suggested we could re-publish our information instantly, changing content and pricing as we pleased with no production cost; whereas our competitors, in the absence of a similar facility, would have to commit to the considerable forward cost of designing, printing and mailing brochures.

Meanwhile our website would allow us to project a brand image to match that of the big operators, even though our financial resources were much less than theirs. It worked, and played a major part in swinging the market towards us. By 1999 we had become provider of choice for many leading and emerging UK brands.

The new talk was of e-commerce, and names like eBay and Amazon were starting to reach the public consciousness, even if they were still substantially a novelty. But in 1999 e-commerce in business-to-business transactions was almost unheard of. 

We didn’t have serious money to spend on anything as speculative as that, and the security of online money transfer was still a major inhibitor to market growth in all online transactions. But we were forward looking in our ideas, and we wanted to be there with the first to try out e-commerce.

I made contact with a small software house in Hull who specialised in e-commerce applications. For £3,000, they said, they could set up an e-shop complete with online purchasing facility. I thought the estimate was unrealistically low: I would have to debug the product to make it fit-for-use.

I would also have to write the application that would upload the product and pricing data to the e-commerce package. And since our product was not the kind of thing the e-commerce software had been designed for, mapping our data to its structure would be fiddly. But I reckoned it was all doable and that I had sufficient skills to be able to work with the supplier. And so it proved to be.

The E-pilogue

Three months and a lot of hard work later, but without any overspend, we had by the middle of the year 2000 a reliable working e-commerce shop. The e-commerce engine linked seamlessly to the basic HTML pages published by my business application. The whole lot was ready to go, subject to the board’s approval. And then the dot-com bubble burst, and the visionary founder of our training business died tragically and suddenly at the age of 47.

The business passed into new hands which, eager to clear the decks of those whose hands it had been in before them, persuaded the ultimate owner that e-commerce was dead and that our investment in it had been a waste of time. The new hands ousted the old ones. I initially found myself redundant, but then found a new career as a civil servant at the Home Office, where I now work with a team on the extensive Adelphi (Oracle) database that meets all our HR, purchasing and procurement needs.

At their suggestion, I have finally joined BCS, some 33 years after I used to attend British Computer Society meetings, feeling like a complete outsider amongst the clever and schooled minds of the IT professionals with whom I shared the room.

These days, I don’t feel like an IT outsider any more. I keep my hand in doing a little vanilla flavoured VB programming in an MS Access dbase that I use for logging and progressing my work. I’m respected and valued by my colleagues as someone who came out of the world of business and end-users and taught himself IT on the factory floor, in a world where if you wanted to build something, you just did it, with whatever skills and artefacts you could pick up along the way: I never went on even one day’s IT training in all those years - way too expensive!

All that carried me through, in the absence of any aptitude, training or recognition, was the excitement of gaining competitive edge by going where no similar business had gone before. I feel as strongly as I ever did that IT must bind the business to the customer, to the benefit of both; must adapt to the changing market; must be enticing to the operator as well as to management by making the jobs of both easier to do.

And the IT profession must work to include in its ranks those with the right empathy for business and for the people who have to use IT in their work. In the small teams I was working in, I was both manager and operator.

I never wrote applications so that IT could replace people - I wrote them so people wasted less time on time consuming and error prone ordinary tasks and could apply that energy instead to improving their business, or at least on preserving it and the livelihoods of those who depended on it, whether employee or customer. 

My old Osborne now lives in the Manchester Museum of Science and Industry, which I approached in 1998 about taking it, as it seemed wrong to throw it away as just so much obsolete plastic junk.

They snapped it up, interviewed me for the archives, and it now shares the same home as ‘Baby’, designed at the University of Manchester and the first machine with all the components now classically regarded as characteristic of the basic computer. I’m really quite chuffed about that. Now I just need to find a suitable museum for me...

Image: iStock/484005076

Comments (5)

  • 1
    Geoff Wood wrote on 22nd Sep 2015

    A well written and authoritative survey of the last forty years of developments in this field.

  • 2
    Glen Vaal wrote on 7th Oct 2015

    Mike, I had no idea. A very different world from the "Corporate IT" I've been used to. It would be interesting to know what you make of "Big Data".

  • 3
    Mike O'Neill wrote on 12th Oct 2015

Hi Glen. I'm not really qualified to speak on Big Data as I've never dealt with it. But from the little I've read might Big Data applications and relational database applications each have a place? The volumes of data I dealt with during my excursions into IT were minuscule compared to corporate databases. All the business data we needed for our £1m per annum turnover fitted onto about 10 megabytes of hard disk. The applications were quick and inexpensive to develop and adapt and (crucially) they gave us tremendous competitive edge because no other similar small business had such an application until years later. Big Data may be invaluable to large organisations wanting to analyse huge amounts of data, maybe coming from different sources. But for so called small to medium enterprises (SMEs), it might be that the cost and learning overhead is too much and/or there just isn't the same need? I'd be very interested to get your views on Big Data, what it offers and who it is best suited for, as this is new territory for me. Does a Big Data application offer the same promise of referential integrity as the relational model in systems where it matters (invoicing, purchasing, booking, customer relationship management etc.)?

  • 4
    Paul Coyne wrote on 29th Oct 2015

    A great tale. You're so right about the need for diversity. Although I got myself a qualification before starting to work in IT I believe that 95% of what I use in my career came from elsewhere and I think I'm all the better for it. IT should be democratised and should use the talents that are available to it. You show so clearly how that can be.

  • 5
    Mike O'Neill wrote on 8th Feb 2016

    Hi Paul

    I was just looking up my article for referring to a colleague and I saw your comment. Yes I've found that 'learning by doing' is what works for me, and I suspect it's true for many others and is perhaps what apprenticeships used to offer.

    On the issue of diversity, I've always felt that since an IT application will encapsulate a real world process/situation/whatever it needs a diversity of reviewers before it can be called fit for signing off.

    I had a nostalgic moment the other night when I managed to install vDos on my 64 bit laptop so that I was able to fire up my old 16 bit Clipper application in a simulated MSDOS box and look at it for the first time since it was authored nearly 25 years ago. It still holds up remarkably well and can generate a simple but quite useable (and completely data driven) website. Eerie to see it spring to life after all these years.
