Battle for Safeguarding Using Technology in Businesses | THE DAILY TRIBUNE | KINGDOM OF BAHRAIN

Battle for Safeguarding Using Technology in Businesses

It is unthinkable today for organizations and companies to operate without computers; virtually all transactions and processing are carried out on them. However, while the use of computers offers great potential, it also puts businesses at risk from cybercrime and the business of hacking, which created the need for security defenses.

Although earlier desktop computers existed, the PC era truly began with the launch of the first personal computers in 1977, which included the Commodore PET, the Apple II, and the Radio Shack TRS-80. All of these computers were built to serve simple needs and lacked even the most basic defenses against potential threats such as hacking.

While these computers were designed as standalone machines, the need for communication between computers, entities, businesses and, ultimately, people began to rise as the Internet came into existence. The development of the Internet and its use as a public communications network changed the game for attackers and defenders alike. As companies began to use the Internet for wide-area connections and services like email and file transfer, and as they established websites supporting business transactions, the Internet became the battleground for the next generation of cybercrime.

As businesses and computer users connected computers in networks, cyber criminals developed computer viruses and worms that used the network to infect systems, steal data, deny service, and spread to other systems. All of this became far easier with the public connectivity the Internet provided. Xerox's Palo Alto Research Center developed the concept of the computer worm, for benign applications, in 1979. But 23-year-old Robert Morris opened the eyes of the world to the worm as a malevolent threat in 1988, when his computer worm infected 6,000 of the 66,000 hosts then on the Internet. He became the first person in the United States convicted of a computer crime.

Similarly, there arose a need for computerized physicians to act as defenders against viruses. These physicians are referred to as anti-virus software, which scans systems to spot and isolate known viruses. Because attacks are ever changing, anti-virus products are backed by ongoing security research that identifies new attacks and automatically updates the anti-virus software to recognize them. Effective anti-virus defenses keep attackers busy developing new exploits, effectively increasing their R&D investment and driving up costs in the hacking business.
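The core mechanism of such a scanner, searching data for known byte signatures, can be sketched in a few lines of Python. The signature database and detection names below are invented for illustration, not real malware data:

```python
# Minimal sketch of signature-based scanning, the core idea behind
# classic anti-virus engines. All signatures here are hypothetical.
SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.Worm.A",    # hypothetical pattern
    b"EVIL_PAYLOAD":     "Example.Trojan.B",  # hypothetical pattern
}

def scan_bytes(data: bytes) -> list:
    """Return the names of all known signatures found in the data."""
    return [name for sig, name in SIGNATURES.items() if sig in data]

def scan_file(path: str) -> list:
    """Scan a file on disk against the signature database."""
    with open(path, "rb") as f:
        return scan_bytes(f.read())
```

Updating the signature database is what the automatic updates described above amount to: research teaches the scanner new patterns without changing the scanning logic itself.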

Still, while these defenses held strong against viruses and worms, the communication between internal networks and the public Internet remained a playground for attackers. In the late 1980s, businesses and public institutions began to connect their internal networks to the new public Internet, so they needed a way to prevent outside users from accessing their private systems through common protocols such as file transfer (FTP) and Telnet. The initial firewalls were simple packet filters that passed or blocked packets based on rules specifying which protocols and ports were allowed. Subsequent improvements extended firewall rules to incorporate knowledge of users, sessions, and applications. Later, next-generation firewalls (NGFW) embedded intrusion prevention system capabilities to spot and block known attacks based on their signatures.
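A first-generation packet filter of the kind described above can be sketched as a list of protocol-and-port rules checked in order; the rule set here is a hypothetical example:

```python
# Minimal sketch of a simple packet filter. Rules and the default-deny
# posture are illustrative assumptions, not a real firewall's policy.
from dataclasses import dataclass

@dataclass
class Rule:
    protocol: str   # e.g. "tcp" or "udp"
    port: int       # destination port
    action: str     # "allow" or "block"

RULES = [
    Rule("tcp", 80, "allow"),   # HTTP allowed
    Rule("tcp", 25, "allow"),   # SMTP (email) allowed
    Rule("tcp", 23, "block"),   # Telnet from outside blocked
]

def filter_packet(protocol: str, port: int, default: str = "block") -> str:
    """Return the action of the first matching rule, else the default."""
    for rule in RULES:
        if rule.protocol == protocol and rule.port == port:
            return rule.action
    return default  # default-deny: anything unlisted is blocked
```

Note that such a filter sees only protocols and ports, which is exactly why later firewalls had to add awareness of users, sessions, and applications.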

Firewalls have slowed attackers down and driven up their costs by forcing them to look for unlocked doors and to hide their attacks within allowed protocols and ports. Attackers now rely on social engineering to convince users to download and run malware delivered over allowed channels like HTTP and email.

As the business of hacking grew exponentially throughout the 1990s, IT organizations realized that early detection of a cyber-attack could help businesses mitigate potential damage. For that reason, they started collecting and examining system log files, which had usually been kept for problem investigation or resource accounting. Security professionals realized that consolidating and correlating activity across diverse log files could detect suspect behavior such as unauthorized or excessive login attempts, unexpected data transfers, and other indicators of attack. Log file management systems provide a common set of tools for storing, consolidating, correlating, and searching log files. To avoid detection, attackers often attempt to cover their tracks by modifying or erasing log files. This adds complexity to their task, and it pushes them toward "low-and-slow" attacks that strive to avoid detection long enough to obtain significant amounts of data.
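One of the indicators mentioned above, excessive failed login attempts, can illustrate how log correlation works in miniature. The log format and the threshold below are assumptions made for the sketch:

```python
# Minimal sketch of log correlation: count failed logins per source
# and flag sources that exceed a threshold. The log line format
# ("... FAILED LOGIN ... from <source>") is a hypothetical example.
from collections import Counter

def failed_login_sources(log_lines, threshold=3):
    """Return sources with more than `threshold` failed login events."""
    counts = Counter()
    for line in log_lines:
        if "FAILED LOGIN" in line:
            source = line.rsplit("from ", 1)[-1].strip()
            counts[source] += 1
    return [src for src, n in counts.items() if n > threshold]

logs = [
    "2024-01-01 10:00:01 FAILED LOGIN user=admin from 203.0.113.9",
    "2024-01-01 10:00:02 FAILED LOGIN user=admin from 203.0.113.9",
    "2024-01-01 10:00:03 FAILED LOGIN user=root from 203.0.113.9",
    "2024-01-01 10:00:04 FAILED LOGIN user=admin from 203.0.113.9",
    "2024-01-01 10:05:00 LOGIN OK user=alice from 198.51.100.7",
]
print(failed_login_sources(logs))  # flags 203.0.113.9 (4 failures > 3)
```

Real log management systems apply the same idea at scale, correlating many event types across many sources, which is also why attackers who erase or modify logs can blind this kind of detection.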

While computing technology helps businesses serve their customers faster and better, it comes at the considerable cost of building strong security defenses. No single security defense will, by itself, keep a business safe. Accordingly, a diverse, ever-changing set of cyber security defenses must be established to manage potential threats and slow attackers down.