Freeman-Walter-Abele: A Tortured History of Software Eligibility

This is part 2 of a multi-part series exploring the history of software patents in America. To start reading from the beginning please see The History of Software Patents in the United States. For all of our articles in this series please visit History of Software Patents. 


The so-called Freeman-Walter-Abele test has been defunct for quite some time, dating back to In re Alappat, in which the Federal Circuit did away with the test. Nevertheless, the Freeman-Walter-Abele test is quite an important step in the history of software patents in the United States. The reasons to spend a significant amount of time discussing the rise and fall of the Freeman-Walter-Abele test are three-fold.

First, the test was widely criticized (rightfully so) as being so flexible that any District Court Judge or three-Judge panel of the Federal Circuit could apply it to justify any preconceived notions and ideological preferences.  Indeed, the Freeman-Walter-Abele test proved to be anything but objective.  The test was unworkable and did not introduce certainty; it introduced unpredictability, which must be avoided at all costs in laws relating to property and in laws relating to business.

The second reason to focus on the Freeman-Walter-Abele test is because there is no way to ignore the fact that the more recent tests in the software space are best characterized as versions of the Freeman-Walter-Abele test in disguise. Under the Freeman-Walter-Abele test there needed to be some physical, tangible link to the process steps, which looks eerily like the machine prong of the machine-or-transformation test from Bilski. Furthermore, under the Freeman-Walter-Abele test it is not enough that the patent claim be drafted as a method; rather, the process must be linked to one or more elements of a statutory apparatus claim that itself would meet the requirements of section 101. The similarity with the machine-or-transformation test is again striking.

The influence of the thinking behind Freeman-Walter-Abele can also be seen in the Supreme Court’s decision in Alice. Thanks to Alice the focus is now on whether the claims cover an abstract idea or concept, and in order to make the determination we are not supposed to look at the language of the claims, but rather to look through the claims. This causes the apparatus claims to rise and fall with the method claims despite the fact that machines are clearly patent eligible according to the terms of the statute. Further, as the law associated with software developed, the industry, with good reason, thought that it would be enough to say that the process steps had to be carried out on a machine (i.e., a computer). That clearly isn’t enough after Alice. While the Supreme Court hasn’t adopted the Freeman-Walter-Abele test, and the current articulation of the test is couched as whether the claims cover only an abstract idea, it does seem that patent claims written to satisfy the moving target of the Freeman-Walter-Abele test should also satisfy the Alice test, which adopts the Mayo framework.

Finally, as the Court of Customs and Patent Appeals and then the Federal Circuit struggled with In re Freeman, In re Walter and In re Abele, they were trying to reconcile seemingly irreconcilable Supreme Court precedent in Gottschalk v. Benson, Parker v. Flook, and Diamond v. Diehr. Prior to the most recent patent eligibility cases many would have likely, and correctly, believed that after Diehr there was no way that Benson and Flook could possibly remain good law. But Flook has reemerged within patent eligibility cases dealing with software like a Phoenix from the ashes. Thus, the trio of cases that make up the Freeman-Walter-Abele test deserve heightened attention. Although the issues we are dealing with today are analytically different, there is an eerie similarity between the rationale used to find processes patent ineligible under the mathematical algorithm approach then and the abstract idea approach today.

Therefore, for practitioners looking for guidance on how to draft software patent claims, the Freeman-Walter-Abele test will likely provide important insights. I will address each case in chronological order.



In re Freeman (CCPA, 1978)

The patent claims in dispute in this appeal to the CCPA related to a system for typesetting alphanumeric information, which utilized a computer-based control system in conjunction with a conventional phototypesetter.  The patent examiner rejected all of the claims as not being supported by the disclosure, and specifically rejected claims 8-10 as being drawn to nonstatutory subject matter because they represented mental steps. The Board reversed both rejections, but entered its own new ground of rejection. In applying the Supreme Court’s guidance from Gottschalk v. Benson, the Board determined that the claimed improvement had  no substantial practical application except in connection with a digital computer and, therefore, granting the patent claim would be tantamount to granting a patent on an algorithm itself.

In a nutshell, the CCPA, with Chief Judge Markey delivering the opinion, explained that the Board read the Supreme Court’s decision in Benson too broadly. In explaining the error of the Board, Chief Judge Markey explained:

The fundamental flaw in the board’s analysis in this case lies in a superficial treatment of the claims. With no reference to the nature of the algorithm involved, the board merely stated that the coverage sought “in practical effect would be a patent on the algorithm itself.” Though the board gave no clear reasons for so concluding, its approach would appear to be that every implementation with a programmed computer equals “algorithm” in the Benson sense. If that rubric be law, every claimed method that can be so implemented would equal nonstatutory subject matter under 35 USC 101. That reasoning sweeps too wide and is without basis in law. The absence, or inadequacy, of detailed claim analysis in the present case is further illustrated by the conclusion that “the novelty resides in the program” when, as here, the claims recite no particular computer program. In the present case, it is not the claims but the specification that discloses implementation of the claimed invention with computer programs.

Chief Judge Markey then announced the first formulation of what was to become known as the Freeman-Walter-Abele test. Markey set forth the test in this way:

Determination of whether a claim preempts nonstatutory subject matter as a whole, in the light of Benson, requires a two-step analysis. First, it must be determined whether the claim directly or indirectly recites an “algorithm” in the Benson sense of that term, for a claim which fails even to recite an algorithm clearly cannot wholly preempt an algorithm. Second, the claim must be further analyzed to ascertain whether in its entirety it wholly preempts that algorithm.

Markey would then go on to explain that only the first prong would be discussed by the court because the claims did not recite an algorithm at all, let alone preempt any or all particular uses of an algorithm. Chief Judge Markey attributed this error to an overbroad understanding of the term “algorithm.” Markey distinguished the algorithm the Supreme Court dealt with in Benson, characterizing it as “a procedure for solving a given type of mathematical problem.” More broadly, however, an algorithm can be defined as “a step-by-step procedure for solving a problem or accomplishing some end.” In short, the definition of algorithm employed by the Board would have rendered all processes patent ineligible as algorithms despite the fact that the statute clearly allows for processes to be patented.


In re Walter (CCPA, 1980)

The patent claims in issue in this case related to an invention used in seismic prospecting and surveying. The applicant invented a method and apparatus for performing the cross-correlation of returning jumbled signals. The invention effectively unscrambled the returning signals. In order to carry out the method several mathematical operations were performed, including computing Fourier transforms and cross-correlations utilizing a modification of the Cooley-Tukey algorithm.

The patent examiner rejected the claims because they were directed to the mathematical procedure. The Board affirmed the rejection and pointed out that the distinction between the method and apparatus claims was of no significance, reasoning that it would be anomalous to grant apparatus claims encompassing the means for practicing a method that had been determined to be patent ineligible. This type of circular logic, where machine and apparatus claims rise and fall not on their own merit but rather on the merit of method claims, is quite familiar today given that the Supreme Court has essentially instructed patent examiners and Judges to ignore the language of the claims, determine what the invention is, and then reject all claims no matter how they are drafted.

Judge Giles Sutherland Rich delivered the opinion for the CCPA, starting his analysis by recognizing that determining patent eligibility “has proved to be one of the most difficult and controversial issues in patent law.” He then went on to point out that the question in this case isn’t about the patent eligibility of computer related inventions per se, but rather about the patentability of mathematical-related inventions.

Of particular interest is the exceptionally lucid explanation of a computer and how it relates to tangible machines. Judge Rich wrote:

A computer is nothing more than an electronic machine. It is characterized by its ability to process data, usually by executing mathematical operations on the data at high speeds. By virtue of the speed with which computers operate, they are capable of executing complex or otherwise time-consuming calculations in fractions of a second. Their use in technology is analogous to the use of mechanical devices, such as levers, which provide mechanical advantage in inventions of a mechanical nature; they make possible, or practicable, the solution of mathematical problems which are impractical to solve manually due to the inordinate amount of time manual solution would consume.

Judge Rich also declined the Solicitor’s invitation to read a point of novelty requirement into Section 101, whereby the claimed method would be patent ineligible if the mathematical algorithm were at the point of novelty. Judge Rich wrote:

If this approach were to be adopted it would immeasurably debilitate the patent system. We do not believe the Supreme Court has acted in a manner so potentially destructive. As an illustration of the utter failure of such an approach to resolve these questions, we offer the example of certain improvement inventions, wherein the improvement resides in the application of scientific truth, e.g., mathematical formulae, to previously known structure or process steps.

Improvement inventions are expressly included within §101, which provides that “Whoever invents any * * * new and useful improvement [of a process, machine, manufacture, or composition of matter] may obtain a patent therefor * * *.”  There is no evidence that Congress intended a different criterion to apply to improvement inventions to determine whether they are statutory. Yet a strict “point of novelty” approach to improvement inventions involving the application of scientific truth as the improvement would effectively place them, as a class, outside the coverage of §101 – and to no purpose.

Judge Rich also pointed out that inserting a novelty requirement into Section 101 would fly in the face of the Supreme Court’s decision in Flook, which demanded that the patent claim be considered in totality.

Judge Rich then explained that Freeman is not in conflict with Flook, and announced a corollary to the Freeman test that takes into account whether mathematical algorithms are preempted. He wrote:

In order to determine whether a mathematical algorithm is “preempted” by a claim under Freeman, the claim is analyzed to establish the relationship between the algorithm and the physical steps or elements of the claim…

When this court has heretofore applied its Freeman test, it has viewed it as requiring that the claim be examined to determine the significance of the mathematical algorithm, i.e., does the claim implement the algorithm in a specific manner to define structural relationships between the elements of the claim in the case of apparatus claims, or limit or refine physical process steps in the case of process or method claims? The point of the analysis is the recognition that “A principle, in the abstract, is a fundamental truth; an original cause; a motive; these cannot be patented,” Le Roy v. Tatham, 55 U.S. at 175, and that, “a hitherto unknown phenomenon of nature” if claimed would not be statutory, but that “the application of the law of nature to a new and useful end,” Funk Bros., 333 U.S. at 130 would be.

Ultimately, Judge Rich determined that the claims in question covered the mathematical algorithm itself. Judge Rich also recognized that most of the claims were limited by their terms to a particular technology, but that did not save them. “This may not be done under the patent law as it now exists,” Rich wrote. In any event, the requirement of a relationship between the algorithm and a tangible manifestation (i.e., physical steps or elements) had been introduced.


In re Abele (Federal Circuit, 1982)

The invention in question in this case related to the field of image processing, particularly image processing as applied to computerized axial tomography or CAT scans. The invention was an improvement in computer tomography that limited X-ray exposure while still reliably producing an improved image. Essentially, the applicants discovered that the spread of X-rays can be reduced, which not only minimizes the patient’s exposure, but also reduces the computer calculation time. Through the use of a weighting function in the calculations, artifacts are eliminated, which produces an improved image over conventional techniques.

Applying Flook, the examiner rejected the claims because apart from the mathematical calculations the remaining steps were well known and, therefore, could not contribute to the existence of statutory subject matter. The Board did not rely on the examiner’s rejection. Instead, without relying on the claim language itself, the Board affirmed the rejection saying that “the mathematical algorithm is not implemented in a manner to define structural relationships between physical elements in the apparatus claims or to refine or limit claim steps in the process claims.” Therefore, the Board concluded that the claims do nothing more than “solve a mathematical algorithm and are manifestly nonstatutory.”

After summarizing the relevant precedential cases, with emphasis on the Supreme Court’s decisions in Benson and Flook, the Federal Circuit, with Judge Nies writing for an expanded panel that included Chief Judge Markey, Judge Rich, Judge Baldwin and Judge Miller, explained:

In sum, the Court’s decisions have made clear that a claim does not present patentable subject matter if it would wholly preempt an algorithm, Benson, supra, or if it would preempt the algorithm but for limiting its use to a particular technological environment, Flook, supra. However, these decisions leave undefined what does constitute statutory subject matter.

This succinctly puts the problem in context. The Supreme Court has been very good about explaining what is not patent eligible, which is typically whatever claim they are reviewing at the moment, but what remains unclear after practically every Supreme Court treatment of the issue is what does constitute patent eligible subject matter. Even in Diamond v. Diehr, the Supreme Court did not really elaborate upon how and why the software in the patent claims was patent eligible, rather choosing to state the test in the negative, recognizing that “a process is not unpatentable simply because it contains a law of nature or a mathematical algorithm.” But how and when is a claim that contains a mathematical algorithm patentable?

With no real help from the Supreme Court to define what is patent eligible with respect to computer implemented methods, Judge Nies explained the holding in Walter in this way:

Walter should be read as requiring no more than that the algorithm be “applied in any manner to physical elements or process steps,” provided that its application is circumscribed by more than a field of use limitation or non-essential post-solution activity. Thus, if the claim would be “otherwise statutory,” id., albeit inoperative or less useful without the algorithm, the claim likewise presents statutory subject matter when the algorithm is included. This broad reading of Walter, we conclude, is in accord with the Supreme Court decisions.

Thus, in order for a patent claim to a computer implemented process to be patent eligible after Abele, there needs to be some real world analogous process that is improved by the presence of an algorithm.

Turning to the claims, the Federal Circuit split the baby in some respects. The broadest process claims were claim 5 and claim 6, which read:

5. A method of displaying data in a field comprising the steps of calculating the difference between the local value of the data at a data point in the field and the average value of the data in a region of the field which surrounds said point for each point in said field, and displaying the value of said difference as a signed gray scale at a point in a picture which corresponds to said data point.

6. The method of claim 5 wherein said data is X-ray attenuation data produced in a two dimensional field by a computed tomography scanner.

The Federal Circuit explained that claim 5 merely covers the calculation of a number and display of the result, although in a particular format. Claim 6, however, adds the limitation that the “data is X-ray attenuation data…” The Federal Circuit explained that such attenuation data is only available when the X-ray has passed through an object and is then detected upon its exit. Thus, claim 6 presented data gathering steps not dictated by the algorithm. Therefore, claim 6 and all the claims dependent on claim 6 were deemed to define patent eligible subject matter.


In part 3 we discuss the next step in the evolution of the law relative to software patents, namely when the Federal Circuit was faced with deciding whether a computer implemented method that transformed data into a readable waveform that could be quickly interpreted was patent eligible. This occurred in Arrhythmia Research and Alappat.



Join the Discussion

6 comments so far.

  • A Rational Person
    December 4, 2014 11:50 am

    Gene @5

    The problem is that you have Supreme Court justices who are willfully ignorant of technology, are arrogant, make up law out of thin air and show no appreciation for the consequences of their decisions.

    For an example of how a decision with respect to patent eligibility should be done, I highly recommend reading Frankfurter’s concurring opinion in Funk Bros. For example, the following passage in Frankfurter’s concurring opinion shows wisdom and foresight totally lacking in any recent decision by the Supreme Court with respect to interpreting 35 USC 101:

    “It only confuses the issue, however, to introduce such terms as ‘the work of nature’ and the ‘laws of nature.’ For these are vague and malleable terms infected with too much ambiguity and equivocation. Everything that happens may be deemed ‘the work of nature,’ and any patentable composite exemplifies in its properties ‘the laws of nature.’ Arguments drawn from such terms for ascertaining patentability could fairly be employed to challenge almost every patent. ”

    The rest of Frankfurter’s concurring opinion also shows a degree of thoughtfulness and understanding of technology totally lacking in recent Supreme Court opinions and oral arguments relating to 35 USC 101.

  • Gene Quinn
    December 3, 2014 04:40 pm

    Rational Person-

    This whole area is hopelessly inconsistent. I find it amazing that CAFC Judges that want to find patent claims ineligible can cite Supreme Court precedent that logically supports their holdings, while at the same time CAFC Judges that want to find patent claims eligible can cite the same Supreme Court precedent and logically support their holdings as well. The law is broken. With such internal inconsistency turmoil and chaos will be the outcome, which is tragic given that this all pertains to the fundamental, threshold inquiry about whether you can get a patent even if the invention described is useful, new, non-obvious and appropriately described in detail.


  • A Rational Person
    December 3, 2014 11:20 am

    Gene @2

    The difficulty the USPTO is having issuing logical and/or satisfactory Myriad-Mayo guidelines provides strong support to your statement that “The [fill in the blank] test is unworkable.”

    In fact, I don’t even see how the USPTO can come up with rational or logical guidelines based on Myriad and Mayo that:

    1. Can provide examples of patent-eligible method claims in chemistry and biotechnology that, based on Myriad and Mayo, can be easily distinguished from patent ineligible method claims; and
    2. Would allow gunpowder to be a patent eligible invention if it were invented today.

    Witness the fact that in the first set of proposed guidelines, gunpowder was considered to occur in nature (see slides 54 and 55 of the March 19, 2014 Myriad-Mayo guidelines).

  • Anon
    December 2, 2014 04:18 pm

    Please pardon my invocation of history, but the Supreme Court has – in fact – turned 35 USC 101 into a nose of wax – but worse, they have so bent and abused the nose, that it has fallen off the face of Justice (and at the risk of mixing eras, much like the Egyptian Sphinx, stands guard without a nose).

  • Gene Quinn
    December 2, 2014 03:56 pm


    The [fill in the blank] test is unworkable.

    Yes, that is exactly the point. In revising this series it has been eye opening. The more things have changed the more they stay the same. Frankly, today I don’t know what the test for patent eligibility is. Sure, we use the Mayo framework, but we are operating in a fantasy land where we are told that Benson, Flook, Diehr, Bilski, Mayo and Alice are all consistent. Of course, there is absolutely no logical way that these decisions are in harmony. In fact, Supreme Court precedent on patent eligibility is completely irreconcilable, and that is before you even bring Chakrabarty and Myriad into the discussion.

  • Curious
    December 2, 2014 12:40 pm

    The test was unworkable and did not introduce certainty; it introduced unpredictability, which must be avoided at all costs in laws relating to property and in laws relating to business
    Oh, you were talking about Freeman-Walter-Abele? I thought you were discussing Bilski-Mayo-Alice. Then again, that was your point, wasn’t it?