A Comparison Of Legacy Systems And Their Significant Contributions To The Information Age
The last two decades have seen significant change in IT environments. The early 2000s ushered in LAN/WAN systems integration, which allowed the enterprise to leverage public data networks (i.e., the Internet) to interconnect their operations and share and exchange information. This was the start of the Information Age: it brought the results of the electronic, digital, and computer eras together to drive more innovation in IT, which in turn led to the birth of new industries. From this, companies like Amazon, Facebook, eBay, and Google were born.
The early 2000s was also a time of transition for many enterprise operations. The monolithic systems of the 80s and 90s were being upgraded to become more flexible and efficient. This was also right after the Y2K scare, during which many companies invested in upgrades to correct a limitation in hardware and software that stored years as two digits and so could not properly represent the year 2000. Disaster was averted, and the effects were seen mostly as minor problems. That does not mean every system continued to work smoothly; there were issues reported worldwide.
With new computers and networking equipment, the world's top companies began their digital transformation. This goes beyond just upgrading hardware; it goes hand in hand with software migration for better performance and more productivity. Microsoft emerged as a leader in this space with its business solutions for client/server computing and a software development environment based on .NET, supported by MSDN (the Microsoft Developer Network).
It was during this time of transformation, between 2000 and 2010, that many standard office systems were upgraded. It was so revolutionary, in fact, that management at many of these companies were amazed at the cost savings and how much faster tasks could be done. In my own observation, when something makes you more efficient, it actually also means more work: career motivations push productivity, and with the new hardware and software tools available, people end up working harder but also smarter.
I am going to look back at three legacy systems that were at one time a core part of many enterprise IT environments. Many have been retired, but they served their purpose well. What these systems provided was the foundation for the cloud and the interconnected, always-on, always-available enterprise network. What we have to remember is that nothing starts out perfect. Early systems were usually rudimentary, like ENIAC, the first general-purpose electronic digital computer. It occupied an entire room, while computers today fit in the palm of your hand with far greater processing power than ENIAC ever had.
AS/400 With OS/400 And AIX
When I started out as a systems engineer, one of my tasks was to set up an AS/400 server for testing and support purposes. The company I was with at the time had purchased a brand new AS/400 from IBM in order to support clients who ran their applications on this platform. This was before the cloud, when the term "Groupware" best described collaborative applications. The application suite involved was Lotus Notes, also called the Lotus Domino suite on the server side, which was used by premium clients.
Out of the box, this was a high-workload server that provided backend services like application and data processing. The server had built-in networking components (a NIC; no Wi-Fi at that time), a huge rectangular enclosure frame, and a power cable. It took two people to lift for the most part (unless you are really strong), and it did not come with a monitor or peripherals. It also had two optical drives (no floppy): a CD-ROM and a DVD-ROM.
Setting up the AS/400 was pretty difficult. Just when you thought installing a server was easy, an AS/400 makes you think again. It is a tower unit that does not fit in the standard rack units (U) used for most typical servers. We had to dedicate a section of the server room to it, and it required its own power outlet since it consumed plenty of power. The most confusing thing about it was how to turn it on. This may be the most basic operation of all, but even that is not as simple as pushing a power button.
The operating system was OS/400 (V4R4), which is unlike Windows and closer in feel to a Unix-based OS. The system we had ran both OS/400 and another operating system called AIX. It did not come with a user-friendly GUI, making the CLI, accessed through a terminal screen, the best way to work with the server. Despite all the difficulty of setup and configuration, this was a stable box.
A Windows-based server (running Windows Server 2003) in those days was fine for domain controllers and file sharing, but not for large multi-user applications. A PC-based application server running Windows with a Microsoft back-end stack, including SQL Server and a custom application, was ideal for environments with 300 or fewer users. The main advantage of an AS/400 over these servers is its ability to handle huge computing loads, supporting up to thousands of users. Mainframe-class systems can scale to millions of users, so a mid-range machine like an AS/400 is worth its value.
An AS/400 is a mid-range server system (later rebranded as the eServer iSeries and then IBM System i), ideal for large workgroup environments and positioned just below a mainframe-class server, so it is an expensive piece of hardware. In this case it had Lotus Notes installed to provide access to hundreds of clients (the number was based on licensing). The client computers were IBM PCs or clone workstations. Lotus Notes provided the clients access to e-mail, calendar, scheduling, and word processing applications. This was a high-end solution for client/server computing architectures at the time.
When it comes to application server performance, the 64-bit RISC architecture of the AS/400 was a huge boost. This was a time when most production servers were not fully 64-bit, but ran 32-bit versions of Windows, Linux, or other Unix variants. The CPU architecture was based on the 64-bit PowerPC, which ran at clock speeds up to 1.9 GHz in later models. The CPU also used three levels of caching and 8 GB of physical memory. The RISC design processes data using many simple, fixed-length instructions that can be pipelined efficiently, as opposed to the x86 CISC design, which uses fewer but more complex instructions.
Many of the features of the AS/400 would be below standard today. A smartphone can outperform an AS/400 in raw data processing, with a more powerful multi-core CPU like the one in an iPhone. Such a small form factor packs more punch than what they had in those days. The AS/400, however, was built for multi-user access, something smartphones cannot really handle. Some companies still use their AS/400 to run legacy software applications, and at other times these machines have been repurposed for other functions (e.g., data pre-processing for machine learning). Such systems are being replaced with cloud-based software and collaboration applications that use a web interface rather than a compiled client application.
Sun UltraSPARC III Workstation With Solaris OS
Before 2010, if you had an UltraSPARC workstation, you had a high-end desktop. It was unlike a modern HEDT, though: not an optimized gaming machine but a workstation built for data processing. Sun Microsystems (now part of Oracle) released the UltraSPARC CPU architecture for its line of workstations to rival the performance of competitors like IBM, Digital Equipment, and Compaq at the time.
The UltraSPARC III workstation used a CPU codenamed "Cheetah," marketing meant to give it an image of computing speed. The CPU was designed by Sun Microsystems but fabricated by Texas Instruments, a work of American high technology. The UltraSPARC uses a superscalar architecture, which implements a form of parallelism with multiple execution pipelines on a single CPU. Later x86 CPUs would likewise implement multiple execution pipelines along with multithreading support.
The UltraSPARC III workstation was available for systems engineers to play around with the Solaris OS. Solaris was at that time a licensed Unix-based operating system (it later became available as open source). We would install Check Point firewall software in a dual-NIC configuration to build a software-based firewall. We then installed and tested our company's cybersecurity application, which used stateful packet inspection of incoming Internet traffic.
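The core idea of stateful inspection can be sketched with a toy connection table: inbound packets are admitted only when they belong to a session an inside host already opened. This is an illustrative Python sketch of the concept only, not any vendor's actual implementation; the class and method names are hypothetical.

```python
# Toy stateful packet inspection: keep a table of outbound sessions and
# admit inbound traffic only if it is the reply leg of a known session.

class StatefulFirewall:
    def __init__(self):
        # Known sessions, keyed by (src_ip, src_port, dst_ip, dst_port).
        self.sessions = set()

    def outbound(self, src, sport, dst, dport):
        # An inside host opens a connection; remember the session.
        self.sessions.add((src, sport, dst, dport))

    def inbound_allowed(self, src, sport, dst, dport):
        # Inbound traffic passes only if it mirrors a session
        # we have already seen going out; otherwise it is dropped.
        return (dst, dport, src, sport) in self.sessions

fw = StatefulFirewall()
fw.outbound("10.0.0.5", 40000, "203.0.113.7", 80)                  # inside host opens HTTP
print(fw.inbound_allowed("203.0.113.7", 80, "10.0.0.5", 40000))    # reply leg: True
print(fw.inbound_allowed("198.51.100.9", 443, "10.0.0.5", 40000))  # unsolicited: False
```

A real firewall would also track protocol state (TCP flags, timeouts) and age entries out of the table, but the lookup logic is the same in spirit.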
These workstations were pretty cool, in both design and performance. 900 MHz was considered fast before 2010, so these workstations could render nicer graphics and run applications with less lag, and with more memory, performance was even better. That was a time when lag lived at the computer level; nowadays the application has moved to the network, and the lag is bandwidth lag. Firewalls today are mostly hardware-based appliance devices that connect directly to the network at data centers. Other firewalls are implemented on SDN (Software-Defined Networking) private cloud networks.
The On-Premise PBX And The Move To VoIP
There was a time when enterprise operations used an on-premise PBX as their private telephone system. A PBX provides a switched telephone system for use within an office. It can have lines that interconnect with the public POTS/PSTN telephone network, reached by dialing an access code, and each person can have their own extension number.
A PBX is actually a complex network of telephone lines that run around the office. Each cubicle has its own telephone port, connected back to the main telephone exchange, which is usually installed in a telephone (or utility) room or the data center. Troubleshooting can be a daunting task, and re-programming phones required a technician to visit the user's desk. Little by little, more automation came to PBX systems, allowing administrators to control the exchange and its configuration from a console.
There were dedicated consoles that connected a computer directly to the PBX switch, as with Nortel and Avaya systems. The network or systems administrator used this console to set up numbers, monitor traffic statistics, and implement switching and trunking protocols. This eased system management, but it did not reduce the complexity of the wiring and cabling involved in a PBX. Tracing connection problems still required going to a separate wiring closet for the phones.
Data communications professionals were moving toward the convergence of voice, video, and data in the 2000s. While the 1990s was the age of the digital electronic PBX, the 2000s introduced VoIP (Voice over IP). VoIP was revolutionary in the sense that it integrates various systems together. It treats all voice, video, and content traffic as data that can be routed over the Internet or transmitted on a LAN using the TCP/IP protocol suite. Cisco was a leading provider of IP telephony solutions, and by 2008 many enterprises had begun the migration to VoIP-based systems.
A VoIP system works just like a PBX, but the main difference is the underlying network. While a PBX uses a circuit-switched network with dedicated wiring terminated at each telephone, a VoIP system requires far less wiring complexity. All you need is a TCP/IP protocol stack over which the telephony system runs. It supports the same features as a PBX, like DID (Direct Inward Dialing), DND (Do Not Disturb), Call Waiting, Call Transfer, Music On Hold, Conference Call, Voice Mail, etc. VoIP introduces other features if the IP phone supports them, like video conferencing, Follow Me call routing, Conference Bridge, SMS text, and business tool integration (e.g., CRM, trouble ticketing systems, auto-dialing software, etc.).
The rollout of VoIP happened along with upgrading a leased line to an MPLS connection. It just made more sense to upgrade the communications infrastructure as a whole to a digital packet system. The analog PBX system at one time had no connectivity to the LAN and sat like a silo. Integrating it into an IP network with VoIP made network administration more versatile, and call management became simpler. This required a line of Cisco products, relying on switches and routers to interconnect office locations.
With VoIP, cable management became much easier too. A telephone can use an Ethernet cable to get voice service rather than a standard telephone cable. VoIP systems also lightened the burden of systems management and administration. It was easier to configure phones, which makes things simpler for administrators in large user environments.
VoIP can also be cloud-based and supports the web for voice applications. If you use Skype or WhatsApp, you are using VoIP: you are communicating with voice carried in data packets across the Internet. In the past, talking to someone in another country required an overseas call. Now all that is needed is an Internet connection and an application that supports VoIP.
The Legacy And Its Successors
If you are going to use an AS/400 or Sun UltraSPARC III mainly for testing, virtual machines today provide a better and less costly solution. Although AS/400 systems are unique in their own way, a virtualized version would be much simpler to implement in a sandboxed testing environment. That can be done on-premise by installing a VMware server that hosts Windows and Linux variants on the same machine, each running a virtual instance of its server operating system. It is less costly to make mistakes, since a server can be rebuilt from virtual images or restored from backup. It is also safer, with less chance of physical damage, since the instance runs in a virtual environment.
Many systems are also being converted to appliance boxes. Examples are network security devices like firewalls and cybersecurity applications. There is really no need anymore to install cybersecurity software on a dedicated server when you can just purchase an appliance. This requires less administration, since the appliance already has the software installed and ready for use after configuration. Appliances can also fetch updates and perform automated routine maintenance (e.g., backups, power cycles, full network scans, etc.).
The use of "Groupware" collaboration suites is also very rare now that the cloud is available. Collaborative applications like Microsoft Office 365, Slack, and Google G Suite provide the tools for businesses to share and present information. These are available directly over the Internet as a cloud service, or SaaS (Software-as-a-Service). Some of these services are also available for free, with more robust versions for corporate or enterprise use. That means there is no longer a need to deploy a high-end on-premise server with software installed. Now you just need a client computer, running any operating system, with an Internet connection. This is more flexible and accessible to users.
VoIP was just what enterprises needed. It is easier and less costly to maintain than a PBX. The flexibility of VoIP allows users to communicate from anywhere, even outside the office, using a public data network like the Internet. This made it ideal for inter-office communications across borders using a routed VoIP network. A device is no longer limited to a telephone, because with VoIP any computer with a microphone and speakers can also function as a communications device.
An entire VoIP system can be software-defined, functioning as a "virtual PBX" that is highly available from any device with an Internet connection. The popularity of VoIP speakerphones has made conferencing more convenient as well, allowing the organizer to set up a call from anywhere with an Internet or IP connection.
I am glad that the AS/400, Sun UltraSPARC III, and PBX are now legacy. They helped people process data and manage information more reliably. They were the solution of their time; now there are better and more efficient solutions available. The newer systems are also more cost effective in the long run because they cost less to maintain and administer, though they may have a higher initial cost. Systems today are also more energy efficient, thanks to large-scale integration of electronic components and power-saving features. What I realize is that nothing remains forever; the constant in technology is change. What is the latest today will be the legacy of tomorrow.