Beamreactor CMS software: Towards personal privacy and honeypots.




A sample scenario to highlight privacy issues and how Beamreactor attempts to deal with these risks.


Every day, streams of online news, Facebook posts and government agencies warn us about the terrible lack of security online. Online data, as much as user privacy, has suddenly come under serious threat, and we are even told about potentially critical economic and military risks in case of cyber warfare. Journalistic sensationalism aside, we felt like summing up some of these risks and how Beamreactor attempts to deal with them.


Privacy online

Let's put online security into the right context. Most people won't skip a beat when told that their social security number can suddenly be found on the internet. They'll be surprised, probably annoyed, then try to 'fix' their computer by looking for potential security holes.

Privacy is an elastic concept. The same people who would be terribly annoyed at being seen naked by strangers might happily film themselves nude for the internet. The privacy line often becomes drawn so tightly that people can no longer see the dangers linked to loss of privacy. Among those who end up being seen naked online, many would probably think, 'so what?'.


Data relationships

Of course, they wouldn't get any more nervous if some of their personal information were found in the movie's metadata (that is, its technical details). Computer details, personal details, whatever: they are usually fairly certain nobody could link them to that information, since millions of people use the same software, webcam, computer settings and/or camera.

Have you ever tried searching the internet for your email or your name?

Some do. A small piece of software called a "harvester" (or robot), for example, will automatically browse the web for any web page, sometimes building incredible webs of accumulated information in gigantic databases. Some spiders specialize in videos and images, and know how to recover the information hidden within a picture, such as EXIF data, or within a video, and relate it to other online content: by its format and settings, its name, the context where it was found, or even poorly spelled words.
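The core of such a harvester is surprisingly small. As a minimal sketch (the page content, names and addresses below are invented for illustration, not taken from any real site), here is the kind of scan a robot runs over every page it fetches:

```python
# Minimal harvester sketch: scan fetched HTML for email addresses and
# outbound links, accumulating them in a database-like structure.
import re

# Hypothetical fetched page, for illustration only.
page = """
<html><body>
  <p>Contact me at jane.doe@example.org or visit
  <a href="https://blog.example.org/jane">my blog</a>.</p>
</body></html>
"""

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
LINK_RE = re.compile(r'href="([^"]+)"')

# Everything found is stored and later cross-related with other pages.
harvest = {
    "emails": EMAIL_RE.findall(page),
    "links": LINK_RE.findall(page),
}
print(harvest)
```

A real harvester adds a crawl queue, EXIF extraction and a relational database behind this, but the principle is exactly that simple: every page is mined for identifiers that can be joined later.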

A common way for online advertising agencies to provide you with the content you search for is to analyze you: your search words, your behavioral patterns, your online hours, mixed with personal settings and online traces. Sometimes the same agencies go deeper and use spyware and/or harvesters.

For IT professionals, such profiling isn't that hard: most users, confidently sitting at home in their armchairs, hardly imagine being at risk, and spread a huge amount of their personal information.


What could one do with this information anyway?

This is where the plot starts. Say Sam is a robot, and Sam has found the same nickname used several times across particular websites. One of these, a forum, also carries a signature with a link to your website. Your website or blog will feature your name, and if not, Facebook or Twitter will help.

Sam, after probing its databases with various relational models, knows who you are, your favorite websites, your personal website, some of your online friends, sometimes the company you work for, and your email. Including that naked video of yours, posted as a joke some years ago after a drunken night out, featuring the exact same hardware serial number visible in another, this time harmless, video posted on your web page.

Still not worried? With such information, identity theft becomes much easier, as do scams, blackmail, targeted spam, defamation and many other "annoyances".


OK, you have my attention. How does using Beamreactor help with all this?

Without going into too much detail, a series of measures has been taken to make the life of web data harvesters a nightmare. Your email is kept hidden from harvesters at all times. Any metadata in pictures and movies sent through Beamreactor gets wiped, and their checksums change as a result. The websites use complex redirection methods that only official harvesters follow. These robots aren't too clever, and are much too curious for their own good; they often fall into built-in Beamreactor traps, where their behavior gets analyzed. These traps are called honeypots.
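The metadata-wiping step can be illustrated with a toy byte layout (the layout and tag values below are hypothetical, not Beamreactor's actual file handling): removing an embedded metadata block necessarily changes the file's checksum, so a copy harvested earlier can no longer be matched back to the cleaned upload by hash.

```python
# Toy illustration: wiping embedded metadata changes the file's checksum.
import hashlib

# Hypothetical upload: pixel bytes with an embedded metadata block.
original = b"\xff\xd8PIXELDATA<meta>Make=ACME;Serial=123</meta>MOREPIXELS"

# Wipe the identifying metadata block.
wiped = original.replace(b"<meta>Make=ACME;Serial=123</meta>", b"")

# The cleaned file hashes differently, breaking hash-based correlation.
print(hashlib.sha256(original).hexdigest() != hashlib.sha256(wiped).hexdigest())
```

This is why serial numbers and camera models embedded in a video can link two otherwise unrelated uploads, and why stripping them on upload matters.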

Most honeypots look like common programming mistakes. We often get comments about the .php extensions in our URIs; they are one of these traps, giving scripts and harvesters the feeling that it is okay to try and fool this website... which in turn shares the attack information with the whole fleet of other Beamreactor-driven websites. To stay on the legal side of honeypotting, we only keep information about the attack type and method, and distribute the attacker's IP solely to the network of Beamreactor-enabled websites.
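In principle, such a trap needs very little logic. As a minimal sketch (the decoy paths, attack labels and IP below are hypothetical, not Beamreactor's real traps), any request to a decoy URL that no legitimate link points to marks the client as a probing script, and gets logged:

```python
# Minimal honeypot sketch: decoy URLs that only probing scripts request.
DECOYS = {
    "/admin.php": "admin-probe",
    "/wp-login.php": "wordpress-probe",
    "/index.php?page=../../etc/passwd": "path-traversal",
}

trap_log = []  # attack type + attacker IP, shared with the trusted network

def handle_request(ip, path):
    """Return True if the request fell into a honeypot."""
    if path in DECOYS:
        trap_log.append({"ip": ip, "attack": DECOYS[path]})
        return True
    return False

# A script probing for a WordPress login page trips the trap.
handle_request("203.0.113.9", "/wp-login.php")
print(trap_log)
```

Note that only the attack type and the source IP are recorded, matching the legal constraint described above.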

Here is a list of the most common web harvesters. The 'Risk' rank shows how dangerous a crawler/harvester/robot is, with green being safe and red being a serious threat.

Crawler knowledge base

Because building quality software isn't just about making empty shells that look good, we ensure that not every doubtful piece of software online gets access to your website and online facilities.


Treveur BRETAUDIERE, Beamreactor CEO.


