Analysis of Topical Vulnerabilities

09 July 2003

Before we get into our latest vulnerability analysis, we'd like to make a couple of points about this series of articles. First, we're starting out with a few rather simple examples (like the one you're about to read) that most clearly illustrate the points we're trying to make. We'll tackle the more abstruse stuff later. Second, we'd like you to know that we always try to be "vendor neutral". We don't really care whose ox gets gored, so long as an important concept gets illustrated. That said, on to this week's example.

Last week, the Core Security Technologies team published an advisory that described a vulnerability (and patch) in Microsoft's NetMeeting software. The Core advisory is located at http://www.coresecurity.com/common/showdoc.php?idx=352&idxseccion=10. An additional write-up on the vulnerability is available on SecurityTracker.com. (We encourage you, as always, to read these alerts in their entirety so that you can get a thorough understanding of the flaw itself.)

We found this vulnerability particularly interesting because it represents a classic example of a security flaw introduced during the actual implementation of the software. We know of no design issue that would lead to this sort of problem directly. Let's take a look.

The Core advisory states, "A directory traversal vulnerability was found in NetMeeting when doing File Transfers. An attacker can use filenames containing "..\..\" when doing a file transfer, and in this manner, create a file in any place of the victim's filesystem, escaping the directory where NetMeeting usually stores incoming files (e.g. C:\Program Files\Received\Received Files)."

Does this sound familiar? In Chapter 4 of Secure Coding (page 119), we cite a file parsing vulnerability that existed in an early 1990s implementation of anonymous FTP on a popular UNIX platform. The details were nearly identical in both cases; only the names of the applications had changed.

In each example, the coding team had taken a file specification from the user and had attempted to parse through the input to verify that it referred to files the user was allowed to access. In both cases, the coding team made a critical implementation mistake: they assumed that the filename the user entered was OK if it began with a certain string (e.g., /pub or c:\Program Files\Received\Received Files). Once that initial test was passed, the input was deemed safe. Unfortunately, an attacker could dupe the software into misbehaving by issuing a relative file path that began correctly and ended badly; for example, /pub/../../etc/passwd or c:\Program Files\Received\Received Files\..\..\..\winnt\.
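
To make the mistake concrete, here is a minimal sketch, in C, of the kind of prefix-only check described above. This is not NetMeeting's or the FTP daemon's actual code; the base directory, function name, and attack string are ours, chosen purely for illustration.

    #include <stdio.h>
    #include <string.h>

    #define BASE_DIR "/pub"

    /* The flawed test: accept any filename that merely begins with
     * the expected base directory. */
    static int naive_path_check(const char *path)
    {
        return strncmp(path, BASE_DIR, strlen(BASE_DIR)) == 0;
    }

    int main(void)
    {
        /* Begins correctly, ends badly: the ".." components walk
         * right back out of the base directory. */
        const char *attack = "/pub/../../etc/passwd";

        printf("%s -> %s\n", attack,
               naive_path_check(attack) ? "accepted" : "rejected");
        return 0;
    }

The check happily accepts the attack string, and the operating system will just as happily resolve the ".." components and hand over /etc/passwd.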

We see this as a mistake that was made purely at implementation time. The design of both applications may well have otherwise been just fine from a security perspective.

How could these mistakes have been avoided? By careful source code review during the implementation phase, preferably with automated tools. In the early 1990s, when the FTP bug was discovered, none were widely available; we would have had to rely on checklists (or, less reliable still, brainstorming sessions). Today, many such tools are available, both commercially and through the Open Source community. We think developers should use them. You'll find a few examples in Chapter 6.
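
For what it's worth, the pattern a careful reviewer (or a good tool) would look for in code like the sketch above is canonicalization before comparison: collapse the "." and ".." components first, and only then compare the result against the base directory. Here is one hedged way to do it, assuming a POSIX system (where realpath() is standard; a Windows implementation would use GetFullPathName()) and a target file that already exists:

    #include <limits.h>
    #include <stdlib.h>
    #include <string.h>

    #define BASE_DIR "/pub/"   /* trailing slash, so "/public" cannot match */

    static int safer_path_check(const char *path)
    {
        char resolved[PATH_MAX];

        /* realpath() collapses "." and ".." and resolves symbolic
         * links.  If it fails, refuse the request. */
        if (realpath(path, resolved) == NULL)
            return 0;

        /* Only now is a prefix comparison meaningful. */
        return strncmp(resolved, BASE_DIR, strlen(BASE_DIR)) == 0;
    }

This is a sketch, not a drop-in fix; the point is simply that the comparison must happen after the path has been reduced to its real form, not before.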

Of course, using tools such as static source code analyzers is no guarantee that flaws like this one will be caught. Testing shows the presence, not the absence, of flaws! Still, in the "belt and suspenders" spirit that we encourage, using them is easily worthwhile.

Mark G. Graff
Kenneth R. van Wyk
9 July 2003

Copyright (C) 2003, Mark G. Graff and Kenneth R. van Wyk. Permission granted to reproduce and distribute in entirety with credit to authors.
