Steganalysis

Author: World Heritage Encyclopedia
Subject: Steganography, VSL, Detection, Cryptographic attacks, Computer forensics
Publisher: World Heritage Encyclopedia
Language: English

Steganalysis is the study of detecting messages hidden using steganography; this is analogous to cryptanalysis applied to cryptography.


Contents

  • Overview
  • Basic techniques
  • Advanced techniques
    • Noise floor consistency analysis
  • Further complications
    • Encrypted payloads
    • Barrage noise
  • Conclusions and further action
  • References
  • Bibliography
  • External links


Overview

The goal of steganalysis is to identify suspected packages, determine whether or not they have a payload encoded into them, and, if possible, recover that payload.

Unlike cryptanalysis, where it is obvious that intercepted data contains a message (though that message is encrypted), steganalysis generally starts with a pile of suspect data files, but little information about which of the files, if any, contain a payload. The steganalyst is usually something of a forensic statistician, and must start by reducing this set of data files (which is often quite large; in many cases, it may be the entire set of files on a computer) to the subset most likely to have been altered.

All of the detection methods described here are defeated if an unmodified file is used not as a carrier but as the key to encrypt or decrypt a message. This may be accomplished by bitwise manipulation, using the file as a digital mask against the plaintext to generate the ciphertext. One example of this method is described in a self-published science fiction novel, in which a file is encrypted using a 1949 photo from a digital archive of National Geographic magazine.[1]
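The mask-based scheme described above can be sketched as a simple XOR against the bytes of a shared file. This is an illustration of my own; the cited novel does not specify the mechanism, and the mask bytes here are invented stand-ins for a real photo file.

```python
# Illustrative sketch, not taken from the cited source: use the bytes of an
# unmodified shared file as a one-time digital mask. XOR-ing the plaintext
# against the mask yields the ciphertext; XOR-ing again with the same mask
# recovers the plaintext. The mask file itself is never altered, so there
# is no statistical anomaly in it for a steganalyst to find.

def mask_bytes(data: bytes, mask: bytes) -> bytes:
    """XOR each byte of `data` against the corresponding byte of `mask`."""
    if len(mask) < len(data):
        raise ValueError("mask must be at least as long as the data")
    return bytes(d ^ m for d, m in zip(data, mask))

mask = bytes(range(256))            # stand-in for the bytes of a shared photo
secret = b"meet at dawn"
ciphertext = mask_bytes(secret, mask)
recovered = mask_bytes(ciphertext, mask)
assert recovered == secret
```

As with any one-time pad, reusing the same mask for more than one message would reintroduce detectable structure.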

Basic techniques

The problem is generally handled with statistical analysis. A set of unmodified files of the same type as the set being inspected, and ideally from the same source (for example, the same model of digital camera or, if possible, the same camera; digital audio from the CD that a set of MP3 files was "ripped" from; etc.), is analyzed for various statistics. Some of these are as simple as spectrum analysis, but since most image and audio files are now compressed with lossy algorithms such as JPEG and MP3, analysts also look for inconsistencies in the way the data has been compressed. For example, a common artifact of JPEG compression is "edge ringing", in which high-frequency components (such as the high-contrast edges of black text on a white background) distort neighboring pixels. This distortion is predictable, and simple steganographic encoding algorithms will produce artifacts that are detectably unlikely.
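One concrete statistic of this kind is the "pairs of values" test: sequential LSB replacement tends to equalize the frequencies of each value pair (2k, 2k+1), which a chi-square statistic can measure. The sketch below is a toy version of that well-known attack; the sample data is invented for illustration.

```python
# Minimal sketch (assumptions mine) of the chi-square pairs-of-values test:
# for 8-bit samples, compare the observed count of each even value against
# the mean of its pair (2k, 2k+1). A statistic near zero means the pair
# frequencies are equalized, which is consistent with (though not proof
# of) LSB replacement; natural data tends to score much higher.

from collections import Counter

def chi_square_lsb_statistic(samples):
    """Chi-square statistic over the 128 pairs-of-values of 8-bit samples."""
    hist = Counter(samples)
    stat = 0.0
    for k in range(128):
        even, odd = hist[2 * k], hist[2 * k + 1]
        expected = (even + odd) / 2.0
        if expected > 0:
            stat += (even - expected) ** 2 / expected
    return stat

flat = [0, 1] * 500 + [10, 11] * 500   # equalized pairs, as embedding produces
skewed = [0] * 900 + [1] * 100         # skewed pair, as natural data often is
assert chi_square_lsb_statistic(flat) < chi_square_lsb_statistic(skewed)
```

A real detector would compute this over sliding windows of an image and track how the statistic evolves, rather than over a whole file at once.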

One case where detection of suspect files is straightforward is when the original, unmodified carrier is available for comparison. Comparing the package against the original file will yield the differences caused by encoding the payload—and, thus, the payload can be extracted.
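A minimal sketch of that known-cover comparison for the LSB case follows; the sample values are invented for illustration.

```python
# Sketch (assumptions mine) of the known-cover case: differencing the
# suspect package against the original carrier reveals exactly which
# samples were altered. For simple LSB replacement, the package's LSB at
# each altered position is a payload bit; unchanged positions may also
# carry payload bits that happened to match the cover's LSBs, so a full
# extraction would read LSBs at every embedding position.

def extract_lsb_diff(original, package):
    """Return (changed_positions, package_lsbs_at_those_positions)."""
    positions, bits = [], []
    for i, (o, p) in enumerate(zip(original, package)):
        if o != p:
            positions.append(i)
            bits.append(p & 1)
    return positions, bits

original = [100, 101, 102, 103]     # pristine carrier samples
package  = [101, 101, 102, 102]     # suspect package: LSBs 0 and 3 flipped
positions, bits = extract_lsb_diff(original, package)
assert positions == [0, 3]
assert bits == [1, 0]
```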

Advanced techniques

Noise floor consistency analysis

In some cases, such as when only a single image is available, more complicated analysis techniques may be required. In general, steganography attempts to make the distortion of the carrier indistinguishable from the carrier's noise floor. In practice, however, this is often improperly simplified to making the modifications resemble white noise as closely as possible, rather than analyzing, modeling, and then consistently emulating the carrier's actual noise characteristics. In particular, many simple steganographic systems just modify the least-significant bit (LSB) of each sample; the modified samples then not only have different noise profiles than unmodified samples, but their LSBs also have different noise profiles than would be expected from analysis of their higher-order bits, which still show some amount of noise. Such LSB-only modification can be detected with appropriate algorithms, in some cases reliably detecting encoding densities as low as 1%.[2]
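A toy model of this LSB inconsistency (my own construction, not the patented detector cited here): in a natural signal the LSB plane still inherits structure from the signal, so its sample-to-sample transition rate stays well below the roughly 0.5 expected of independent coin flips, while LSB replacement pushes it toward 0.5 and leaves the higher-order bit planes untouched.

```python
# Simplified illustration (assumptions mine): compare the transition rate
# of the LSB plane before and after LSB replacement. The carrier is a
# slowly varying ramp; the "payload" is a stream of pseudo-random bits.

import random

def bit_plane(samples, plane):
    """Extract bit `plane` (0 = least significant) from each sample."""
    return [(s >> plane) & 1 for s in samples]

def transition_rate(bits):
    """Fraction of adjacent positions whose bits differ."""
    return sum(a != b for a, b in zip(bits, bits[1:])) / (len(bits) - 1)

smooth = [i // 4 for i in range(400)]              # slowly varying carrier
clean_rate = transition_rate(bit_plane(smooth, 0))

rng = random.Random(0)                             # stand-in for payload bits
stego = [(s & ~1) | rng.getrandbits(1) for s in smooth]
stego_rate = transition_rate(bit_plane(stego, 0))

assert clean_rate < 0.3                            # structured LSB plane
assert abs(stego_rate - 0.5) < 0.1                 # payload looks like noise
assert bit_plane(stego, 1) == bit_plane(smooth, 1) # higher plane untouched
```

Published detectors such as RS analysis exploit far subtler correlations than this, which is how they reach the low detectable densities mentioned above.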

Further complications

Encrypted payloads

Detecting a probable steganographic payload is often only part of the problem, as the payload may have been encrypted first. Encryption is not done solely to make recovery of the payload more difficult: most strong ciphers have the desirable property of making the payload indistinguishable from uniformly distributed noise, which both hampers detection efforts and saves the steganographic encoding technique the trouble of distributing the signal energy evenly (but see above concerning errors in emulating the native noise of the carrier).
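A simple byte-entropy check illustrates the point: a strong cipher's output is close to the 8-bit maximum, so content statistics cannot separate it from noise, whereas natural-language plaintext is far from uniform. Random bytes stand in for ciphertext below; no real cipher is invoked.

```python
# Hedged sketch (assumptions mine): Shannon entropy of the byte
# distribution, in bits per byte. Uniformly distributed data approaches
# the maximum of 8.0; English text sits well below it.

import math
import random
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of the byte histogram, in bits (maximum 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

rng = random.Random(42)
ciphertext_like = bytes(rng.randrange(256) for _ in range(4096))
plaintext = b"the quick brown fox jumps over the lazy dog " * 100

assert byte_entropy(ciphertext_like) > 7.8   # close to the 8-bit maximum
assert byte_entropy(plaintext) < 5.0         # natural language is skewed
```

High entropy alone is not proof of a payload, of course; compressed media are also high-entropy, which is exactly why detection must target the carrier's noise statistics instead.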

Barrage noise

If inspection of a storage device is considered very likely, the steganographer may attempt to barrage the analyst with, effectively, misinformation: a large set of decoy files encoded with anything from random data to white noise to meaningless drivel to deliberately misleading information. The encoding density of these files may be slightly higher than that of the "real" ones, and the possible use of multiple algorithms of varying detectability should be considered. The steganalyst may be forced to check these decoys first, potentially wasting significant time and computing resources. The downside of this technique is that it makes it far more obvious that steganographic software was available and was used.

Conclusions and further action

Obtaining a warrant or taking other action based solely on steganalytic evidence is a risky proposition unless a payload has been completely recovered and decrypted; otherwise, all the analyst has is a statistic indicating that a file may have been modified, and that the modification may have been the result of steganographic encoding. Because this will frequently be the case, steganalytic suspicions will often have to be backed up with other investigative techniques.

References

  1. ^ Bob Blink, "Timelines", self-published, Amazon Kindle edition, p. 189.
  2. ^ U.S. Patent No. 6,831,991, "Reliable detection of LSB steganography in color and grayscale images", Jessica Fridrich et al., issued December 14, 2004. (This invention was made with Government support under F30602-00-1-0521 and F49620-01-1-0123 from the U.S. Air Force. The Government has certain rights in the invention.)

Bibliography

  • Geetha, S.; Sindhu, Siva S. Sivatha (October 2009). "Blind image steganalysis based on content independent statistical measures maximizing the specificity and sensitivity of the system". Computers & Security 28 (7): 683–697. Elsevier.
  • Geetha, S.; Kamaraj, N. (July 2010). "Evolving decision tree rule based system for audio stego anomalies detection based on Hausdorff distance statistics". Information Sciences 180 (13): 2540–2559. Elsevier.

External links

  • Steganalysis research and papers by Neil F. Johnson, addressing attacks against steganography and watermarking, and countermeasures to these attacks.
  • Research Group. Ongoing research in steganalysis.
  • Digital Invisible Ink Toolkit. An open-source image steganography suite that includes both steganography and steganalysis implementations.
  • Steganography - Implementation and detection. A short introduction to steganography, discussing several information sources in which data can be hidden.
  • StegSecret. A Java-based, multiplatform steganalysis tool that detects hidden information embedded with the best-known steganographic methods, including EOF, LSB, and DCT-based techniques.
  • Virtual Steganographic Laboratory (VSL). Includes LSB RS-analysis and blind (universal) BSM-SVM steganalysis. A free, platform-independent graphical block-diagramming tool for combining, testing, and adjusting methods for both image steganography and steganalysis, with a modular plug-in architecture and a simple GUI.
This article was sourced from World Heritage Encyclopedia under the Creative Commons Attribution-ShareAlike License; additional terms may apply.