Another recent victim of Lapsus$ was Samsung: about 190 GB of data was stolen in the breach.
According to GitGuardian analysts, the leaked Samsung source code contains thousands of private keys, some of which could be very useful to cybercriminals.
Analysts have identified more than 6,600 private keys, usernames and passwords, and AWS, Google and GitHub keys in the leaked data. They also note that about 90% of the keys appear to be used in internal systems, which makes them difficult for attackers to exploit.
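Secret-scanning tools of the kind GitGuardian builds typically find such keys by pattern matching over source text. As a minimal illustration (not GitGuardian's actual detector), classic AWS access key IDs follow a well-known format, "AKIA" plus 16 uppercase alphanumeric characters, so a single regex can flag them:

```python
import re

# Classic AWS access key IDs start with "AKIA" followed by
# 16 uppercase letters or digits.
AWS_KEY_PATTERN = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_aws_keys(source_text: str) -> list[str]:
    """Return every substring that looks like an AWS access key ID."""
    return AWS_KEY_PATTERN.findall(source_text)

sample = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"'
print(find_aws_keys(sample))  # ['AKIAIOSFODNN7EXAMPLE']
```

Real scanners combine many such patterns with entropy checks and provider-side validation, but the core idea is this simple, which is why thousands of keys can be pulled out of a 190 GB dump automatically.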
Lapsus$, which appears to have members in both South America and Europe, initially attacked only Portuguese institutions, but its sights have since expanded: in a short time the group's name has been linked to the NVidia, Samsung and Ubisoft incidents.
Last month, Bridgestone suffered a cyber attack and shut down factories on February 27. Outages were reported at factories in Iowa, Illinois, North Carolina, South Carolina, Tennessee and Canada for a long time after the incident.
After the incident, the LockBit ransomware group claimed responsibility. The hackers leaked data, started a countdown timer for payment of the ransom, and threatened to publish the data stolen from the company if the money was not transferred in time.
While some wondered whether the attack was related to the ongoing Russia-Ukraine tension, developments show that it was focused entirely on financial gain.
Meanwhile, the group published a post stating: “For us it is just business and we are all apolitical. We are only interested in money for our harmless and useful work. All we do is provide paid training to system administrators around the world on how to properly set up a corporate network. We will never, under any circumstances, take part in cyber-attacks on critical infrastructures of any country in the world or engage in any international conflicts.”
“Many people ask us, will our international community of post-paid pentesters, threaten the west on critical infrastructure in response to cyber aggression against Russia? Our community consists of many nationalities of the world, most of our pentesters are from the CIS including Russians and Ukrainians, but we also have Americans, Englishmen, Chinese, French, Arabs, Jews, and many others in our team. Our programmers developers live permanently around the world in China, the United States, Canada, Russia and Switzerland. Our servers are located in the Netherlands and the Seychelles, we are all simple and peaceful people, we are all Earthlings.”
Samsung has confirmed the leak of the company’s internal data, including source code associated with Galaxy smartphones.
“According to our initial analysis, the leak includes source code related to the operations of Galaxy devices, but does not include personal information of our customers and employees,” Samsung officials told Bloomberg. Officials also added that they have put new security measures in place and do not expect a similar incident in the future.
The LAPSUS$ group claimed to have stolen 190 GB of data from Samsung, including the source code for trusted applets, algorithms for biometric authentication, bootloaders, and confidential data from chip supplier Qualcomm.
LAPSUS$ first shared a sample of data it claimed was leaked from Samsung, and Samsung then confirmed the leak.
Meanwhile, admin portal credentials were shared by threat actors on the RAID forum, an underground hacking forum.
Not long before, LAPSUS$ had also stolen 1 TB of data from NVIDIA.
ShinyHunters claim to have an AT&T database containing sensitive information on more than 70 million customers. In a post they shared, the threat actors demanded $200,000 for the database. The post appeared a few days after another threat actor sold information about T-Mobile customers. T-Mobile has confirmed its data breach, but a relationship between the two events has not yet been established.
According to the sample records ShinyHunters shared, the database includes the following customer information:
Name and surname
Social security numbers
AT&T claimed that the aforementioned information is not related to its systems and denied the breach.
Both AT&T and T-Mobile have been marred by several security incidents in the recent past.
DLP is a technology we have used for more than a decade. Its starting point was protecting organizations' intellectual property (IP), and it became very popular across many sectors. Organizations have spent, and are still spending, millions of dollars on DLP solutions to protect their private data. Yet Gartner says of them: “They become an annoying or toothless technical control rather than a component in a powerful information risk management process.” But why?
According to some surveys, professionals' biggest challenge is keeping policies up to date at the rate of business. Others are the inhibition of employee productivity caused by these policies, and limited data visibility. Too many false positives are also a very big problem for IT professionals.
Taking these step by step, the need for policies is really one of the biggest problems of DLP solutions, regardless of vendor. Before anything else, organizations have to know what data they must protect; for this, they have to know which data is sensitive for them. Most organizations started their DLP project without knowing their sensitive data, and it is impossible to know what the sensitive data is without data classification. Again, most organizations learned this only after implementing DLP, and started a data classification project perhaps years later. And of course, merely starting a classification project is not enough to classify the data: it is a very broad, continuous process that needs wide awareness among users.
So, because of this uncertainty about their own data, organizations took their policies from others' experiences instead of their own needs; industry experience became very important at this step. They created and ran the policies hoping these would protect their data.
At the same time, knowing what to protect is not enough; you must also know how to protect it. If you do not know which channels people can use to leak data, protecting it is likewise impossible. These channels, too, were added to the policies according to industry experience. Even though security risk management professionals know what it means to miss a required policy, they run these policies with the thought of protecting as much as they can. Everybody knew this was not enough to protect all the data, so the slogan became: “DLP prevents the user from doing wrong things; it does not prevent data leakage by malicious users.”
Another weakness of DLP is its focus on content to identify data. Even with recent features like AI, it identifies a file by its content, using pattern matching (such as regex) or exact matching; very limited context examination is used. So DLP is again ineffective against malicious users, since the content can be changed very easily before leaking it. Moreover, in a living organization the content of sensitive data inevitably changes, which requires policies to be constantly updated. But as I said before, new policies mean a greater chance of inhibiting employee productivity, more time spent optimizing rules, and more exclusions, and more exclusions mean more weakness in data protection. More focus on context is needed to protect the data.
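To make the evasion point concrete, here is a minimal sketch (not any vendor's real engine) of a regex-based content rule for US social security numbers, and how a trivial change of representation slips past it:

```python
import base64
import re

# A simplistic content rule of the kind DLP engines use:
# match US SSNs written as ddd-dd-dddd.
SSN_RULE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def dlp_blocks(payload: str) -> bool:
    """Return True if the content rule would flag this outbound payload."""
    return bool(SSN_RULE.search(payload))

secret = "SSN: 123-45-6789"
print(dlp_blocks(secret))  # True - caught at the egress point

# A malicious user only has to change the representation:
encoded = base64.b64encode(secret.encode()).decode()
print(dlp_blocks(encoded))  # False - the same data leaks unflagged
```

Base64 is just one example; zipping with a password, screenshots, or rephrasing the data all defeat a purely content-based rule in the same way, which is why content inspection mainly stops honest mistakes.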
In big organizations, false positives can be the biggest problem, given the number of employees, sensitive data and policies. The large number of incidents produced every day requires more time, and certainly more staff, to review. And if you survey the teams reviewing DLP incidents, they will tell you that hundreds of incidents are simply ignored. Actually, I believe it would be a good result if the organization catches one or two real incidents in a year. The organization hopes the captured incidents yield an acceptable ROI; meanwhile, it can never be sure that nobody leaked any data.
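Some rough base-rate arithmetic shows why the triage burden grows so fast. All the numbers below are illustrative assumptions, not survey data:

```python
# Illustrative (assumed) numbers for a large organization:
alerts_per_day = 500        # incidents the DLP console raises daily
minutes_per_review = 3      # analyst time to triage one alert
true_positive_rate = 1e-5   # assume 1 in 100,000 alerts is a real leak

analyst_hours_per_day = alerts_per_day * minutes_per_review / 60
real_incidents_per_year = alerts_per_day * 365 * true_positive_rate

print(f"{analyst_hours_per_day:.0f} analyst-hours of triage per day")
print(f"{real_incidents_per_year:.1f} real incidents caught per year")
```

With these assumptions the team burns roughly 25 analyst-hours every day, several full-time reviewers, to surface on the order of one or two genuine incidents a year, which is exactly the ROI question raised above.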
Every IT professional who has used DLP knows there are many other annoying aspects. For example, if you do not want someone leaking data through an endpoint channel like a printer or USB, every PC needs an agent installed, and of course those agents must work as they should. This is a very big challenge for the IT personnel managing endpoint solutions: it means dealing with very strange situations, sometimes spending too much time on a single PC when a problem occurs, and continuously testing the agent. Not only incident analysis but also management of the DLP solution itself requires a great deal of resources.
One last thing I want to mention: DLP inspects only at the point of egress. On the endpoint, that is the printer or USB; in the network layer, internet access; and in the email channel, emails sent outside the organization. Data protection must also cover the inside of the network, such as file servers. Since, as we have seen, protection at the egress point is difficult and leaves opportunities to leak data (because of policy gaps, a misbehaving agent, changed content, and so on), this point becomes very important.
As a result, DLP is not as efficient a solution as expected. It must be a continuous process, not a single project on its own. Despite all this, I do not believe DLP will die. In many countries and industries, regulations make DLP compulsory, requiring a solution that covers the endpoint, network and email channels, and we still do not have a more efficient solution by itself. But organizations should consider supporting their DLP with other solutions such as UEBA or DaBA. DaBA solutions in particular can provide complete visibility of the movement of sensitive data across the whole network. Even when users do not try to leak data outside the organization (in which case egress inspection cannot catch anything), it is very important to know who is using this data inside the organization, so the data can be followed with a need-to-know approach: if someone does not need data for his job, he should not be able to reach it. UEBA and DaBA solutions can provide this visibility and add a new layer to the data protection mechanism.