Major cyber attack - Who is at fault?

VK3DRB:
https://www.theage.com.au/politics/federal/morrison-reveals-malicious-state-based-cyber-attack-hitting-several-sectors-20200619-p5545z.html
The Prime Minister did not mention who launched the attack, but it was almost certainly China.

We will always be open to rogue states and individuals attacking organisations. It is up to the organisations involved to ensure they hire competent people to provide an adequate defence against attacks. It seems the victim is as much to blame as the enemy. If someone leaves their keys in an unlocked car parked at a shopping centre and the car gets stolen, who is to blame? Both the thief and the owner of the car.

If standard electronics engineering practices were used in IT, our infrastructure would be less prone to attack. In a product, we provide protection from serious external disturbances (EMC susceptibility, overvoltage), protection of the external environment (EMC emissions, overcurrent, fire) and protection of the user (safety). The product must meet regulatory approvals. It should be similar for IT infrastructure. All too often the IT industry is reactive, not proactive. In electronics, we don't wait until someone has been electrocuted before deciding to insulate a circuit.

Why are organisations so vulnerable? Is it because of clueless IT managers who evidently pervade industry and government? Is it because software vendors leave too many vulnerabilities in their products? Or is it just too complex compared to a circuit? Or some other reason?

T3sl4co1l:
A variety of reasons, those among them, yes.

EMC is easy; we can, in principle, calculate or measure the fields, and design filters by analytical methods.  It may not be easy, or practical even, but it is eminently possible.  We can generate fields of arbitrary nature, and usually it suffices to try a few combinations of modulated frequencies and impulses.  Stateful signals are not typically required (if they were, there's still a method to explore that space in polynomial time, but it's not exactly low order).
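
To make "analytical methods" concrete, here is a minimal back-of-the-envelope sketch in Python (the component values are invented purely for illustration): the cutoff of a first-order RC low-pass filter falls straight out of a closed-form formula, the kind of analysis software correctness rarely admits.

--- Code: ---
# Back-of-the-envelope analytical filter design (illustrative values only):
# a first-order RC low-pass on a noisy line. The cutoff frequency follows
# directly from the closed-form formula f_c = 1 / (2*pi*R*C).
import math

R = 100.0    # series resistance, ohms (assumed for illustration)
C = 100e-9   # shunt capacitance, farads (assumed for illustration)

f_c = 1 / (2 * math.pi * R * C)
print(f"cutoff frequency ~= {f_c / 1e3:.1f} kHz")  # ~15.9 kHz
--- End code ---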

Whereas for software, we have some methods to touch the edges of things -- static and dynamic analysis, fuzzing, etc. -- but no general approach, and provably so!  (A checker for general program correctness would itself be a program reasoning about the behaviour of arbitrary programs; whether it always terminates with an answer is itself a Halting Problem.)
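
As a toy illustration of fuzzing "touching the edges": the deliberately buggy parse_record parser below is invented for this sketch. Random inputs expose its bug almost immediately, but a clean run would have proven nothing about the inputs never tried.

--- Code: ---
# Toy fuzzing sketch: hammer an invented, deliberately buggy parser with
# random inputs until one crashes it. Fuzzing finds *some* bugs cheaply,
# but says nothing about the inputs it never generated.
import random

def parse_record(data: bytes) -> bytes:
    """Toy format: the first byte declares the payload length."""
    n = data[0]
    # Bug: trusts the declared length instead of checking len(data).
    return bytes(data[1 + i] for i in range(n))

random.seed(0)
for trial in range(100_000):
    blob = bytes(random.randrange(256) for _ in range(random.randrange(1, 8)))
    try:
        parse_record(blob)
    except IndexError:
        print(f"trial {trial}: crash on input {blob.hex()}")
        break
--- End code ---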

We're even more hopeless at social and political systems.  Humans will always be fallible, selfish, lazy, and suggestible.  Computer science, in and of itself, is in general intractable; with humans writing software and interacting with it, bugs are essentially guaranteed.

Tim

noreply:

--- Quote from: VK3DRB on June 19, 2020, 01:26:25 am ---
Why are organisations so vulnerable?

--- End quote ---

Because 'convenience' is INVERSELY related to 'security' ...

and the IT industry - INCLUDING users (the public) - prefers 'convenience' without realizing the above inverse relationship  :P

golden_labels:
noreply and T3sl4co1l provided two reasons, which IMO are the most important. But there are also psychological factors.

The human brain is horrible at perceiving probability. The further a likelihood moves away from 50%, the more people round their perception of it towards 0% or 100%, and that happens fast. At the same time, people rarely think with proper rigour; they make guesses based on what they have experienced personally or heard from people close to them. Those two factors together have fatal consequences: risks from low-frequency events are greatly underestimated. Security breaches are exactly such events.
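
A back-of-the-envelope calculation makes the point concrete (the 2% figure is invented purely for illustration): a risk that rounds to "basically never" in any single year compounds into something substantial over time.

--- Code: ---
# Illustrative numbers only: a 2% chance of a breach in any given year
# "feels like" never, so it gets rounded down to 0%. Compounded over a
# decade, it is anything but negligible.
p_yearly = 0.02
years = 10

p_at_least_one = 1 - (1 - p_yearly) ** years
print(f"{p_at_least_one:.1%}")  # ~18.3% chance of at least one breach
--- End code ---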

Emotional state, shaped for example by framing, by the environment one is in, or by personal beliefs, affects risk perception even more. Just as people, suitably conditioned, may fear a very unlikely event to the point of treating it as a greater risk than something nearly certain, they also tend to ignore even considerable risks.

With such a skewed perception of risk, what incentive is there to care about security? Why would anyone care if “it’s not going to happen to me”?

While this is a less important factor, one should not forget about persistent bad habits in development and administration: administrators who take no criticism and avoid audits. If someone can't reveal the precise principles on which the network's protection is based, that should be alarming. If the employee responsible for security dislikes the idea of pentesting, managers should really ask the important question: why? Perhaps giving that post to a person with no relevant knowledge, or whose knowledge is a mix of personal fantasies and misconceptions from 30 years ago, was not the best idea. Overconfidence and overestimating one's own skills happen among programmers(1). Have you heard someone dismiss comments about doing something unsafe by saying “you can use it, if you know what you're doing” (implicitly: they know what they're doing)? That is yet another reliable red flag. Many people are also unable to say “no” to their superiors, despite knowing some ideas are bad.
____
(1) Among all experience-based professions, to be honest.

magic:
I will add the ever-growing complexity of software to the laundry list. Nothing will improve until "Moore's law" stops and people are forced to start thinking about removing things that are not needed, rather than adding things that might, perhaps, be useful.
