List of important publications in computer science
This is a list of important publications in computer science, organized by field. Some reasons why a particular publication might be regarded as important:
- Topic creator – A publication that created a new topic
- Breakthrough – A publication that changed scientific knowledge significantly
- Influence – A publication which has significantly influenced the world or has had a massive impact on the teaching of computer science.
Artificial intelligence[edit]
Computing Machinery and Intelligence[edit]
- Alan Turing
- Mind, 59:433–460, 1950.
- Online copy
Description: This paper discusses the various arguments for why a machine cannot be intelligent and asserts that none of those arguments is convincing. The paper also proposed the Turing test, which it calls "the Imitation Game": according to Turing it is pointless to ask whether or not a machine can think intelligently, and checking whether it can act intelligently is sufficient.
A Proposal for the Dartmouth Summer Research Project on Artificial Intelligence[edit]
Description: This summer research proposal inaugurated and defined the field. It contains the first use of the term artificial intelligence and this succinct description of the philosophical foundation of the field: "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it." (See philosophy of AI) The proposal invited researchers to the Dartmouth conference, which is widely considered the "birth of AI". (See history of AI.)
Fuzzy sets[edit]
- Lotfi Zadeh
- Information and Control, Vol. 8, pp. 338–353. (1965).
Description: This seminal paper, published in 1965, introduced fuzzy set theory and developed its basic mathematics.
Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference[edit]
- Judea Pearl
- Morgan Kaufmann, 1988. ISBN 1-55860-479-0
Description: This book introduced Bayesian methods to AI.
Artificial Intelligence: A Modern Approach[edit]
- Stuart J. Russell and Peter Norvig
- Prentice Hall, Englewood Cliffs, New Jersey, 1995. ISBN 0-13-080302-2
- Textbook's website
Description: The standard textbook in Artificial Intelligence. The book web site lists over 1100 colleges.
Machine learning[edit]
An Inductive Inference Machine[edit]
- Ray Solomonoff
- IRE Convention Record, Section on Information Theory, Part 2, pp. 56–62, 1957
- (A longer version of this, a privately circulated report, 1956, is online).
Description: The first paper written on machine learning. Emphasized the importance of training sequences, and the use of parts of previous solutions to problems in constructing trial solutions to new problems.
Language identification in the limit[edit]
- E. Mark Gold
- Information and Control, 10(5):447–474, 1967
- Online version: (HTML) (PDF)
Description: This paper created Algorithmic learning theory.
On the uniform convergence of relative frequencies of events to their probabilities[edit]
- V. Vapnik, A. Chervonenkis
- Theory of Probability and Its Applications, 16(2):264–280, 1971
Description: Introduced VC theory: statistical uniform convergence of relative frequencies and the VC dimension, foundational results for computational learning theory. While the bounds in this paper are not the best possible, it took 50 years before the best bound was obtained, by Michael Naaman in 2021.[1]
A theory of the learnable[edit]
- Leslie Valiant
- Communications of the ACM, 27(11):1134–1142 (1984)
Description: The Probably approximately correct learning (PAC learning) framework.
Learning representations by back-propagating errors[edit]
- David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams
- Nature, 323, 533–536, 1986
Description: Seppo Linnainmaa's reverse mode of automatic differentiation[2][3] (first applied to neural networks by Paul Werbos[4]) is used in experiments by David Rumelhart, Geoff Hinton and Ronald J. Williams to learn internal representations.
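A minimal sketch of the idea, assuming NumPy: a two-layer network trained by propagating the output error backwards through the chain rule (an illustration only, not the paper's exact architecture or experiments).

```python
import numpy as np

# Toy two-layer network trained by backpropagating errors (a sketch, not the
# paper's experiments). The hidden layer learns an internal representation.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = (X[:, 0:1] * X[:, 1:2] > 0).astype(float)         # XOR-like labels
W1, b1 = rng.normal(size=(2, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.5, np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)              # forward pass
    p = sigmoid(h @ W2 + b2)
    d_p = (p - y) * p * (1 - p)           # backward pass: output-layer error
    d_h = (d_p @ W2.T) * h * (1 - h)      # error propagated to the hidden layer
    W2 -= lr * h.T @ d_p / len(X); b2 -= lr * d_p.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X); b1 -= lr * d_h.mean(axis=0)

print("training accuracy:", ((p > 0.5) == (y > 0.5)).mean())
```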
Induction of Decision Trees[edit]
- J.R. Quinlan
- Machine Learning, 1:81–106, 1986.
Description: Decision trees are a common learning algorithm and a decision representation tool. Decision trees were developed by many researchers in many areas even before this paper, but this paper is one of the most influential in the field.
Learning Quickly When Irrelevant Attributes Abound: A New Linear-threshold Algorithm[edit]
- Nick Littlestone
- Machine Learning 2: 285–318, 1988
- Online version(PDF)
Description: One of the papers that started the field of on-line learning. In this learning setting, a learner receives a sequence of examples, making predictions after each one, and receiving feedback after each prediction. Research in this area is remarkable because (1) the algorithms and proofs tend to be very simple and beautiful, and (2) the model makes no statistical assumptions about the data. In other words, the data need not be random (as in nearly all other learning models), but can be chosen arbitrarily by "nature" or even an adversary. Specifically, this paper introduced the winnow algorithm.
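A minimal sketch of the winnow update rule in its usual textbook formulation (promotion and demotion by a fixed factor, threshold equal to the number of attributes); the paper analyzes several closely related variants.

```python
def winnow(examples, n, alpha=2.0):
    """Online Winnow: examples yields (x, y) with x a 0/1 list of n attributes
    and y in {0, 1}. Mistake-driven multiplicative updates of the weights."""
    w = [1.0] * n
    theta = float(n)                        # prediction threshold
    for x, y in examples:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if pred == 0 and y == 1:            # false negative: promote active weights
            w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
        elif pred == 1 and y == 0:          # false positive: demote active weights
            w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w

# Target concept x[0] OR x[2]: the weights of the two relevant attributes grow,
# while irrelevant attributes are quickly suppressed.
data = [([1, 0, 0, 0], 1), ([0, 1, 0, 0], 0), ([0, 0, 1, 1], 1), ([0, 1, 0, 1], 0)] * 5
print(winnow(data, n=4))
```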
Learning to predict by the method of Temporal difference[edit]
- Richard S. Sutton
- Machine Learning 3(1): 9–44, 1988
- Online copy(PDF)
Description: The Temporal difference method for reinforcement learning.
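A minimal sketch of the tabular TD(0) update popularized by this line of work, applied to a small random walk (the learning rate and episode count are illustrative).

```python
import random

# TD(0) on a 5-state random walk: start in the middle state, step left or right
# uniformly, reward 1 only when falling off the right end. The true values are
# 1/6, 2/6, 3/6, 4/6, 5/6.
V = [0.0] * 5
alpha, gamma = 0.1, 1.0
for _ in range(5000):
    s = 2
    while True:
        s_next = s + random.choice([-1, 1])
        done = s_next < 0 or s_next > 4
        r = 1.0 if s_next > 4 else 0.0
        target = r if done else r + gamma * V[s_next]
        V[s] += alpha * (target - V[s])       # temporal-difference update
        if done:
            break
        s = s_next
print([round(v, 2) for v in V])
```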
Learnability and the Vapnik–Chervonenkis dimension[edit]
- A. Blumer
- A. Ehrenfeucht
- D. Haussler
- M. K. Warmuth
- Journal of the ACM, 36(4):929–965, 1989.
Description: The complete characterization of PAC learnability using the VC dimension.
Cryptographic limitations on learning boolean formulae and finite automata[edit]
- M. Kearns
- L. G. Valiant
- In Proceedings of the 21st Annual ACM Symposium on Theory of Computing, pages 433–444, New York. ACM.
Description: Proving negative results for PAC learning.
The strength of weak learnability[edit]
- Robert E. Schapire
- Machine Learning, 5(2):197–227, 1990.
- Online version(PDF)
Description: Proved that weak and strong learnability are equivalent in the noise-free PAC framework. The proof was done by introducing the boosting method.
A training algorithm for optimum margin classifiers[edit]
- Bernhard E. Boser
- Isabelle M. Guyon
- Vladimir N. Vapnik
- Proceedings of the Fifth Annual Workshop on Computational Learning Theory 5 144–152, Pittsburgh (1992).
- Online version(HTML)
Description: This paper presented support vector machines, a practical and popular machine learning algorithm. Support vector machines often use the kernel trick.
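A brief usage sketch, assuming the scikit-learn library (a widely used implementation, not the authors' original code): the RBF kernel separates two concentric rings that no linear classifier in the input space can.

```python
import numpy as np
from sklearn.svm import SVC

# Two noisy concentric rings: not linearly separable in the plane, but the RBF
# kernel implicitly maps points into a space where a maximum-margin separating
# hyperplane exists.
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 200)
radii = np.where(np.arange(200) < 100, 1.0, 3.0) + rng.normal(0, 0.1, 200)
X = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
y = (np.arange(200) >= 100).astype(int)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```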
A fast learning algorithm for deep belief nets[edit]
- Geoffrey E. Hinton
- Simon Osindero
- Yee-Whye Teh
- Neural Computation (2006)
- Online PDF
Description: This paper presented a tractable greedy layer-wise learning algorithm for deep belief networks, which led to great advancement in the field of deep learning.
Knowledge-based analysis of microarray gene expression data by using support vector machines[edit]
- MP Brown
- WN Grundy
- D Lin
- Nello Cristianini
- CW Sugnet
- TS Furey
- M Ares Jr
- David Haussler
- PNAS, 2000 January 4;97(1):262–7
Description: The first application of supervised learning, in particular support vector machines, to gene expression data. The method is now standard, and the paper is one of the most cited in the area.
Attention Is All You Need[edit]
- Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan Gomez, Lukasz Kaiser, and Illia Polosukhin
- Advances in Neural Information Processing Systems 30 (2017)
Description: Introduced the transformer.
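A minimal NumPy sketch of the scaled dot-product attention at the heart of the transformer (single head, no masking and no learned projections, so only a fragment of the full architecture):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 16))
print(scaled_dot_product_attention(Q, K, V).shape)   # (4, 16)
```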
Compilers[edit]
On the translation of languages from left to right[edit]
- Knuth, D. E. (July 1965). "On the translation of languages from left to right". Information and Control. 8 (6): 607–639. doi:10.1016/S0019-9958(65)90426-2.
Description: LR parser, which does bottom up parsing for deterministic context-free languages. Later derived parsers, such as the LALR parser, have been and continue to be standard practice, such as in Yacc and descendants.[5]
Semantics of Context-Free Languages.[edit]
- Donald Knuth
- Math. Systems Theory 2:2 (1968), 127–145.
Description: Introduced attribute grammars (grammar attribution), the basis for yacc's s-attributed and zyacc's LR-attributed approaches.
Compiler Construction for Digital Computers[edit]
- Gries, David (1971). Compiler Construction for Digital Computers (also translated into Spanish, Japanese, Chinese, Italian, and Russian). New York: John Wiley and Sons. ISBN 0-471-32776-X.
Description: The first textbook on compilers. Dick Grune says that entire generations of compiler constructors grew up with it and have not regretted it. One of the first texts to be produced using punched-card input to a formatting program. The cards and formatting program are preserved at the Stanford Computer History Exhibits. See David Gries#Textbooks for links.
A Unified Approach to Global Program Optimization[edit]
- Gary Kildall
- Proceedings of ACM SIGACT-SIGPLAN 1973 Symposium on Principles of Programming Languages.
Description: Formalized the concept of data-flow analysis as fixpoint computation over lattices, and showed that most static analyses used for program optimization can be uniformly expressed within this framework.
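A minimal sketch of the kind of fixpoint computation the paper formalizes: reaching definitions on a small, hypothetical control-flow graph, iterated with a worklist until the sets stop growing (the lattice is sets of definitions ordered by inclusion).

```python
# OUT[n] = GEN[n] ∪ (IN[n] − KILL[n]), with IN[n] the union of OUT over predecessors.
# d1 and d2 both define variable x; d3 defines y.
succ = {"entry": ["b1"], "b1": ["b2", "b3"], "b2": ["b4"], "b3": ["b4"], "b4": ["b1"]}
pred = {n: [m for m in succ if n in succ[m]] for n in succ}
gen  = {"entry": set(), "b1": {"d1"}, "b2": {"d2"}, "b3": {"d3"}, "b4": set()}
kill = {"entry": set(), "b1": {"d2"}, "b2": {"d1"}, "b3": set(), "b4": set()}

out = {n: set() for n in succ}
worklist = list(succ)
while worklist:
    n = worklist.pop()
    in_n = set().union(*(out[p] for p in pred[n])) if pred[n] else set()
    new_out = gen[n] | (in_n - kill[n])
    if new_out != out[n]:            # monotone growth guarantees termination
        out[n] = new_out
        worklist.extend(succ[n])     # successors must be re-examined
print(out)
```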
A program data flow analysis procedure[edit]
- Frances E. Allen, J. Cocke
- Commun. ACM, 19, 137–147, 1976
Description: From the abstract: "The global data relationships in a program can be exposed and codified by the static analysis methods described in this paper. A procedure is given which determines all the definitions which can possibly reach each node of the control flow graph of the program and all the definitions that are live on each edge of the graph."
YACC: Yet another compiler-compiler[edit]
- Stephen C. Johnson
- Unix Programmer's Manual Vol 2b, 1979
- Online copy (HTML) Archived 2011-07-11 at the Wayback Machine
Description: Yacc is a tool that made compiler writing much easier.
gprof: A Call Graph Execution Profiler[edit]
- Susan L. Graham, Peter B. Kessler, Marshall Kirk McKusick
- Proceedings of the ACM SIGPLAN 1982 Symposium on Compiler Construction, SIGPLAN Notices 17, 6, Boston, MA. June 1982.
- Online copy; pdf
Description: The gprof profiler
Compilers: Principles, Techniques and Tools[edit]
- Alfred V. Aho
- Ravi Sethi
- Jeffrey D. Ullman
- Monica S. Lam (Co-author of the second edition, 2006)
- Addison-Wesley, 1986. ISBN 0-201-10088-6
Description: This book became a classic in compiler writing. It is also known as the Dragon book, after the (red) dragon that appears on its cover.
Computer architecture[edit]
Colossus computer[edit]
- T. H. Flowers
- Annals of the History of Computing, Vol. 5 (No. 3), 1983, pp. 239–252.
- The Design of Colossus
Description: The Colossus machines were early computing devices used by British codebreakers to break German messages encrypted with the Lorenz Cipher during World War II. Colossus was an early binary electronic digital computer. The design of Colossus was later described in the referenced paper.
First Draft of a Report on the EDVAC[6][edit]
- John von Neumann
- June 30, 1945, the ENIAC project.
- First Draft of a report on the EDVAC (PDF)
Description: It contains the first published description of the logical design of a computer using the stored-program concept, which has come to be known as the von Neumann architecture. See First Draft of a Report on the EDVAC.
Architecture of the IBM System/360[edit]
- Gene Amdahl, Fred Brooks, G. A. Blaauw
- IBM Journal of Research and Development, 1964.
- Architecture of the IBM System/360
Description: The IBM System/360 (S/360) is a mainframe computer system family announced by IBM on April 7, 1964. It was the first family of computers making a clear distinction between architecture and implementation.
The case for the reduced instruction set computer[edit]
- DA Patterson, DR Ditzel
- Computer Architecture News, vol. 8, no. 6, October 1980, pp. 25–33.
- Online version(PDF)
Description: Introduced the reduced instruction set computer (RISC) design philosophy: a CPU design that favors a smaller set of simpler instructions.
Comments on "the Case for the Reduced Instruction Set Computer"[edit]
- DW Clark, WD Strecker
- Computer Architecture News, 1980.
- Online version(PDF)
Description: A response to, and critique of, the preceding RISC paper.
The CRAY-1 Computer System[edit]
- RM Russell
- Communications of the ACM, January 1978, volume 21, number 1, pages 63–72.
- Online version(PDF)
Description: The Cray-1 was a supercomputer designed by a team including Seymour Cray for Cray Research. The first Cray-1 system was installed at Los Alamos National Laboratory in 1976, and it went on to become one of the best known and most successful supercomputers in history.
Validity of the Single Processor Approach to Achieving Large Scale Computing Capabilities[edit]
- Gene Amdahl
- AFIPS 1967 Spring Joint Computer Conference, Atlantic City, N.J.
- Online version(PDF)
Description: Amdahl's law.
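In the now-standard formulation (modern notation, not a quotation from the paper), if a fraction $p$ of a workload can be parallelized across $N$ processors, the overall speedup is

$$S(N) = \frac{1}{(1 - p) + p/N},$$

so with $p = 0.95$ and $N = 64$ the speedup is only about 15.4, and no number of processors can push it beyond $1/(1-p) = 20$.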
A Case for Redundant Arrays of Inexpensive Disks (RAID)[edit]
- David A. Patterson, Garth Gibson, Randy H. Katz
- In International Conference on Management of Data, pages 109–116, 1988.
- Online version(PDF)
Description: This paper discusses the concept of RAID disks, outlines the different levels of RAID, and the benefits of each level. It is a good paper for discussing issues of reliability and fault tolerance of computer systems, and the cost of providing such fault-tolerance.
The case for a single-chip multiprocessor[edit]
- Kunle Olukotun, Basem Nayfeh, Lance Hammond, Ken Wilson, Kunyung Chang
- In SIGOPS Oper. Syst. Rev. 30, pages 2–11, 1996.
Description: This paper argues that the approach taken to improving the performance of processors by adding multiple instruction issue and out-of-order execution cannot continue to provide speedups indefinitely. It lays out the case for making single chip processors that contain multiple "cores". With the mainstream introduction of multicore processors by Intel in 2005, and their subsequent domination of the market, this paper was shown to be prescient.
Computer graphics[edit]
The Rendering Equation[edit]
- J. Kajiya
- SIGGRAPH: ACM Special Interest Group on Computer Graphics and Interactive Techniques, pages 143–150[7]
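Description: Introduced the rendering equation, the integral equation underlying physically based rendering. In its common modern form (notation varies across texts) it expresses the outgoing radiance at a surface point $x$ in direction $\omega_o$ as emitted plus reflected radiance:

$$L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i,$$

where $f_r$ is the BRDF, $L_i$ the incoming radiance, and $n$ the surface normal; path tracing and other global-illumination methods are Monte Carlo estimators of this integral.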
Elastically deformable models[edit]
- Demetri Terzopoulos, John Platt, Alan Barr, Kurt Fleischer
- Computer Graphics, 21(4), 1987, 205–214, Proc. ACM SIGGRAPH'87 Conference, Anaheim, CA, July 1987.
- Online version(PDF)
Description: The Academy of Motion Picture Arts and Sciences cited this paper as a "milestone in computer graphics".
Sketchpad, a Man-Machine Graphical Communication System[edit]
- Ivan E. Sutherland
- MIT PhD thesis, 1963
Description: One of the founding works on computer graphics.
Computer vision[edit]
The Phase Correlation Image Alignment Method[edit]
- C.D. Kuglin and D.C. Hines
- IEEE 1975 Conference on Cybernetics and Society, 1975, New York, pp. 163–165, September
Description: A correlation method based upon the inverse Fourier transform
Determining Optical Flow[edit]
- Berthold K.P. Horn and B.G. Schunck
- Artificial Intelligence, Volume 17, 185–203, 1981
- OA article here: doi:10.1016/0004-3702(81)90024-2
Description: A method for estimating the image motion of world points between 2 frames of a video sequence.
An Iterative Image Registration Technique with an Application to Stereo Vision[edit]
- Lucas, B.D. and Kanade, T.
- Proceedings of the 7th International Joint Conference on Artificial Intelligence, 674–679, Vancouver, Canada, 1981
Description: This paper provides an efficient technique for image registration.
The Laplacian Pyramid as a compact image code[edit]
- Peter J. Burt and Edward H. Adelson
- IEEE Transactions on Communications, COM-31(4), pp. 532–540, 1983.
Description: A technique for image encoding using local operators of many scales.
Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images[edit]
- Stuart Geman and Donald Geman
- IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984
Description: Introduced (1) Markov random fields (MRFs) for image analysis and (2) Gibbs sampling, which revolutionized computational Bayesian statistics and thus had a paramount impact on many fields in addition to computer vision.
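A minimal sketch of Gibbs sampling itself, here for a bivariate normal with correlation rho (where each full conditional is again normal) rather than the paper's image-restoration setting:

```python
import math
import random

rho = 0.8                     # correlation of the target bivariate normal
x, y = 0.0, 0.0
samples = []
for i in range(20000):
    # Draw each variable from its conditional given the current value of the
    # other: x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    x = random.gauss(rho * y, math.sqrt(1 - rho ** 2))
    y = random.gauss(rho * x, math.sqrt(1 - rho ** 2))
    if i >= 1000:             # discard burn-in
        samples.append((x, y))

est = sum(a * b for a, b in samples) / len(samples)
print(round(est, 2))          # estimated correlation, close to 0.8
```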
Snakes: Active contour models[edit]
- Michael Kass, Andrew Witkin, and Demetri Terzopoulos[8]
- International Journal of Computer Vision, 1(4):321–331, 1987 (Marr Prize Special Issue).
Description: An interactive variational technique for image segmentation and visual tracking.
Condensation – conditional density propagation for visual tracking[edit]
- M. Isard and A. Blake
- International Journal of Computer Vision, 29(1):5–28, 1998.
Description: A particle-filtering technique (condensation) for visual tracking.
Object recognition from local scale-invariant features[edit]
- David Lowe
- International Conference on Computer Vision, pp. 1150–1157, 1999
Description: A technique (scale-invariant feature transform) for robust feature description
Concurrent, parallel, and distributed computing[edit]
Topics covered: concurrent computing, parallel computing, and distributed computing.
Databases[edit]
A Relational Model of Data for Large Shared Data Banks[edit]
- E. F. Codd
- Communications of the ACM, 13(6):377–387, June 1970
Description: This paper introduced the relational model for databases, which became the dominant model for database systems.
Binary B-Trees for Virtual Memory[edit]
- Rudolf Bayer
- ACM-SIGFIDET Workshop 1971, San Diego, California, Session 5B, p. 219–235.
Description: This paper introduced the B-tree data structure, which became a standard index structure in databases and file systems.
Relational Completeness of Data Base Sublanguages[edit]
- E. F. Codd
- In: R. Rustin (ed.): Database Systems: 65–98, Prentice Hall and IBM Research Report RJ 987, San Jose, California : (1972)
- Online version (PDF) Archived 2007-06-15 at the Wayback Machine
Description: Defined relational completeness as a measure of the expressive power of database sublanguages and showed that relational algebra and relational calculus are equivalent in this sense.
The Entity Relationship Model – Towards a Unified View of Data[edit]
- Peter Pin-Shan Chen
- ACM Transactions on Database Systems, 1(1):9–36, March 1976
Description: This paper introduced the entity-relationship (ER) model and the entity-relationship diagram (ERD) method of database design.
SEQUEL: A structured English query language[edit]
- Donald D. Chamberlin, Raymond F. Boyce
- International Conference on Management of Data, Proceedings of the 1974 ACM SIGFIDET (now SIGMOD) workshop on Data description, access and control, Ann Arbor, Michigan, pp. 249–264
Description: This paper introduced the SEQUEL language, which later became SQL.
The notions of consistency and predicate locks in a database system[edit]
- Kapali P. Eswaran, Jim Gray, Raymond A. Lorie, Irving L. Traiger
- Communications of the ACM 19, 1976, 624–633
Description: This paper defined the concepts of transaction, consistency and schedule. It also argued that a transaction needs to lock a logical rather than a physical subset of the database.
Federated database systems for managing distributed, heterogeneous, and autonomous databases[edit]
- Amit Sheth, J. A. Larson
- ACM Computing Surveys, Special Issue on Heterogeneous Databases, Volume 22, Issue 3, pp. 183–236, September 1990
- ACM source
Description: Introduced the concept of federated database systems, which had a major impact on data interoperability and the integration of heterogeneous data sources.
Mining association rules between sets of items in large databases[edit]
- Rakesh Agrawal, Tomasz Imielinski, Arun Swami
- Proc. of the ACM SIGMOD Conference on Management of Data, pages 207–216, Washington, D.C., May 1993
Description: Association rules, a very common method for data mining.
History of computation[edit]
The Computer from Pascal to von Neumann[edit]
- Goldstine, Herman H. (1972). The Computer from Pascal to von Neumann. Princeton University Press. ISBN 978-0-691-08104-5.
Description: Perhaps the first book on the history of computation.
A History of Computing in the Twentieth Century[edit]
edited by:
- Nicholas Metropolis
- J. Howlett
- Gian-Carlo Rota
- Academic Press, 1980, ISBN 0-12-491650-3.
Description: Several chapters by pioneers of computing.
Information retrieval[edit]
A Vector Space Model for Automatic Indexing[edit]
- Gerard Salton, A. Wong, C. S. Yang
- Commun. ACM 18(11): 613–620 (1975)
Description: Presented the vector space model.
Extended Boolean Information Retrieval[edit]
- Gerard Salton, Edward A. Fox, Harry Wu
- Commun. ACM 26(11): 1022–1036 (1983)
Description: Presented the inverted index
A Statistical Interpretation of Term Specificity and Its Application in Retrieval[edit]
- Karen Spärck Jones
- Journal of Documentation 28: 11–21 (1972). doi:10.1108/eb026526.
Description: Conceived a statistical interpretation of term specificity called Inverse document frequency (IDF), which became a cornerstone of term weighting.
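A minimal sketch of IDF-based term weighting combined with the vector space model described above (logarithm base and smoothing conventions vary across systems):

```python
import math
from collections import Counter

docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "cats and dogs are pets"]
tokenized = [d.split() for d in docs]
N = len(tokenized)
df = Counter(term for doc in tokenized for term in set(doc))
idf = {t: math.log(N / df[t]) for t in df}            # rarer terms weigh more

def tfidf(doc):
    tf = Counter(doc)
    return {t: tf[t] * idf[t] for t in tf}

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = lambda w: math.sqrt(sum(x * x for x in w.values())) or 1.0
    return dot / (norm(u) * norm(v))

vecs = [tfidf(d) for d in tokenized]
print(round(cosine(vecs[0], vecs[1]), 2), round(cosine(vecs[0], vecs[2]), 2))
```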
Networking[edit]
A Protocol for Packet Network Intercommunication[edit]
- Vint Cerf, Robert Kahn
- IEEE Transactions on Communications, 1974.
- Online copy (PDF)
Description: This paper contains many of the ideas that later became TCP and IP, two foundational protocols that make up the Internet. Cerf and Kahn received the ACM Turing Award, in part for the work contained in this paper.
The Design Philosophy of the DARPA Internet Protocols[edit]
- David Clark
- ACM SIGCOMM Computer Communications Review, Vol. 18, No. 4, pp. 106–114, August 1988.
- Online copy (PDF)
Description: This paper describes some of the design principles behind the Internet, and how those design principles are realized in the Internet.
End-To-End Arguments in System Design[edit]
- J. H. Saltzer, D. P. Reed, D. D. Clark
- ACM Transactions on Computer Systems, 2(4):277–288, November 1984
Description: This paper presents the "end-to-end argument", a classic design principle widely used to guide the design of many of the Internet's protocols and systems.
Congestion Avoidance and Control[edit]
- Van Jacobson, Michael J. Karels
- ACM SIGCOMM, 1988.
- Online copy (HTML)
Description: This paper identifies the problem of network congestion, and presents an algorithm for how protocols can reduce their sending rate to reduce congestion. This approach was incorporated into the TCP protocol, and influenced design of many other data transport protocols.
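A minimal sketch of the additive-increase/multiplicative-decrease behaviour this work popularized (the constants and the loss model are illustrative, not the exact TCP algorithm):

```python
# Congestion window in segments: grow by one per round trip while the network
# accepts the traffic, halve on a loss signal. The result is the familiar
# sawtooth that oscillates around the available capacity.
capacity = 40.0
cwnd = 1.0
trace = []
for rtt in range(60):
    trace.append(round(cwnd, 1))
    if cwnd > capacity:          # loss detected: multiplicative decrease
        cwnd /= 2
    else:                        # no loss: additive increase
        cwnd += 1
print(trace)
```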
Analysis and Simulation of a Fair Queuing Algorithm[edit]
- Alan Demers, Srinivasan Keshav, Scott Shenker
- ACM SIGCOMM CCR, Vol. 19, No. 4, September 1989.
- Online copy (PDF)
Description: This paper presents "fair queuing", a buffer allocation algorithm nearly universally deployed on Internet routers.
Scalable High Speed IP Routing Lookups[edit]
- M. Waldvogel, G. Varghese, J. Turner, B. Plattner
- ACM SIGCOMM, August 1997.
- Online copy (PDF)
Description: This paper describes an algorithmic approach to finding the prefix (supernet) containing a particular IP address, a process which is now nearly universally used on Internet routers.
Chord: A Scalable Peer-to-peer Lookup Service for Internet Applications[edit]
- Ion Stoica, Robert Morris, David Karger, M. Frans Kaashoek, Hari Balakrishnan
- ACM SIGCOMM, August 2001
- Online copy (PDF)
Description: This paper presents the concept of a Distributed Hash Table (DHT), a distributed data structure that has influenced the design of a number of peer-to-peer systems, distributed filesystems, and other large-scale distributed systems.
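A minimal sketch of the consistent-hashing idea behind Chord: nodes and keys are hashed onto the same identifier ring, and each key is owned by its successor node. The real protocol adds finger tables for O(log n) lookups and handles node churn, both omitted here.

```python
import hashlib
from bisect import bisect_right

def ring_id(name, bits=16):
    """Hash a name onto the identifier ring [0, 2^bits)."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (1 << bits)

nodes = sorted(ring_id(f"node-{i}") for i in range(8))

def successor(key_name):
    """Identifier of the node responsible for a key: first node clockwise."""
    k = ring_id(key_name)
    i = bisect_right(nodes, k)
    return nodes[i % len(nodes)]          # wrap around the ring

for key in ["alpha.txt", "beta.txt", "gamma.txt"]:
    print(key, "->", successor(key))
```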
Also see the "Top Ten Networking Papers" lists published in ACM SIGCOMM CCR:
- "10 Networking Papers: Recommended Reading," Jon Crowcroft. Online copy (PDF)
- "10 Papers for the Ph.D. Student in Networking," Craig Partridge. Online copy (PDF)
- "10 Networking Papers: Recommended Reading," Jim Kurose. Online copy (PDF)
- "10 Networking Papers: Reading for Protocol Design," David Wetherall. https://dl.acm.org/doi/pdf/10.1145/1140086.1140096
- "10 Networking Papers: A Blast from the Past," Mostafa H. Ammar.
Operating systems[edit]
An experimental timesharing system.[edit]
- Fernando J. Corbató, M. Merwin-Daggett, and R.C. Daley
- Proceedings of the AFIPS FJCC, pages 335–344, 1962.
- Online copy (HTML)
Description: This paper discusses time-sharing as a method of sharing computer resources. The idea changed how people interact with computer systems.
The Working Set Model for Program Behavior[edit]
- Peter J. Denning
- Communications of the ACM, Vol. 11, No. 5, May 1968, pp 323–333
- Online version(PDF)
Description: Introduced the working set model of program behavior, the starting point for analytical treatment of memory locality, paging, and caching. For more information see SIGOPS Hall of Fame Archived 2006-09-21 at the Wayback Machine.
Virtual Memory, Processes, and Sharing in MULTICS[edit]
- Robert C. Daley, Jack B. Dennis
- Communications of the ACM, Vol. 11, No. 5, May 1968, pp. 306–312.
- Online version(PDF)
Description: The classic paper on Multics, the most ambitious operating system in the early history of computing. Difficult reading, but it describes the implications of trying to build a system that takes information sharing to its logical extreme. Most operating systems since Multics have incorporated a subset of its facilities.
The nucleus of a multiprogramming system[edit]
- Per Brinch Hansen
- Communications of the ACM, Vol. 13, No. 4, April 1970, pp. 238–242
- Online version(PDF)
Description: Classic paper on the extensible nucleus architecture of the RC 4000 multiprogramming system, and what became known as the operating system kernel and microkernel architecture.
Operating System Principles[edit]
- Per Brinch Hansen
- Prentice Hall, Englewood Cliffs, NJ, July 1973
- Online version (ACM Digital Library)
Description: The first comprehensive textbook on operating systems. Includes the first monitor notation (Chapter 7).
A note on the confinement problem[edit]
- Butler W. Lampson
- Communications of the ACM, 16(10):613–615, October 1973.
- Online version(PDF)
Description: This paper addresses issues in constraining the flow of information from untrusted programs. It discusses covert channels, but more importantly it addresses the difficulty in obtaining full confinement without making the program itself effectively unusable. The ideas are important when trying to understand containment of malicious code, as well as aspects of trusted computing.
The UNIX Time-Sharing System[edit]
- Dennis M. Ritchie and Ken Thompson
- Communications of the ACM 17(7), July 1974.
- Online copy
Description: The Unix operating system and its principles were described in this paper. Its main importance lies not in the paper itself but in the operating system, which had a tremendous effect on operating systems and computer technology.
Weighted voting for replicated data[edit]
- David K. Gifford
- Proceedings of the 7th ACM Symposium on Operating Systems Principles, pages 150–159, December 1979. Pacific Grove, California
- Online copy (few formats)
Description: This paper describes the consistency mechanism known as quorum consensus. It is a good example of algorithms that provide a continuous set of options between two alternatives (in this case, between the read-one write-all and the write-one read-all consistency methods). There have been many variations and improvements by researchers in the years that followed, and it is one of the consistency algorithms that should be understood by all. The options available by choosing different size quorums provide a useful structure for discussing the core requirements for consistency in distributed systems.
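The core intersection condition, in modern notation: with $n$ replicas holding one vote each, choosing a read quorum of size $r$ and a write quorum of size $w$ such that $r + w > n$ guarantees that every read quorum shares at least one replica with every write quorum, so a read always sees the most recently written version (distinguished by a version number). For $n = 5$, the choices $(r, w) = (1, 5)$, $(3, 3)$ and $(5, 1)$ all satisfy the condition and trace out the continuum between read-one/write-all and write-one/read-all mentioned above.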
Experiences with Processes and Monitors in Mesa[edit]
- Butler W. Lampson, David D. Redell
- Communications of the ACM, Vol. 23, No. 2, February 1980, pp. 105–117.
- Online copy (PDF)
Description: This is the classic paper on synchronization techniques, including both alternate approaches and pitfalls.
Scheduling Techniques for Concurrent Systems[edit]
- J. K. Ousterhout
- Proceedings of the Third International Conference on Distributed Computing Systems, 1982, pp. 22–30.
Description: Introduced algorithms for coscheduling related processes.
A Fast File System for UNIX[edit]
- Marshall Kirk Mckusick, William N. Joy, Samuel J. Leffler, Robert S. Fabry
- ACM Transactions on Computer Systems, Vol. 2, No. 3, August 1984, pp. 181–197.
- Online copy (PDF)
Description: The file system of UNIX. One of the first papers discussing how to manage disk storage for high-performance file systems. Most file-system research since this paper has been influenced by it, and most high-performance file systems of the last 20 years incorporate techniques from this paper.
The Design of the UNIX Operating System[edit]
- Maurice J. Bach, AT&T Bell Labs
- Prentice Hall • 486 pp • Published 05/27/1986
Description: This definitive description principally covered the System V Release 2 kernel, with some new features from Release 3 and BSD.
The Design and Implementation of a Log-Structured File System[edit]
- Mendel Rosenblum, J. K. Ousterhout
- ACM Transactions on Computer Systems, Vol. 10, No. 1 (February 1992), pp. 26–52.
- Online version
Description: Log-structured file system.
Microkernel operating system architecture and Mach[edit]
- David L. Black, David B. Golub, Daniel P. Julin, Richard F. Rashid, Richard P. Draves, Randall W. Dean, Alessandro Forin, Joseph Barrera, Hideyuki Tokuda, Gerald Malan, David Bohman
- Proceedings of the USENIX Workshop on Microkernels and Other Kernel Architectures, pages 11–30, April 1992.
Description: This is a good paper discussing one particular microkernel architecture and contrasting it with monolithic kernel design. Mach underlies Mac OS X, and its layered architecture had a significant impact on the design of the Windows NT kernel and modern microkernels like L4. In addition, its memory-mapped files feature was added to many monolithic kernels.
An Implementation of a Log-Structured File System for UNIX[edit]
- Margo Seltzer, Keith Bostic, Marshall Kirk McKusick, Carl Staelin
- Proceedings of the Winter 1993 USENIX Conference, San Diego, CA, January 1993, 307-326
Description: This paper described the first production-quality implementation of the log-structured file system idea and spawned much additional discussion of the viability and shortcomings of log-structured file systems. While "The Design and Implementation of a Log-Structured File System" was certainly first, this one was important in bringing the research idea to a usable system.
Soft Updates: A Solution to the Metadata Update problem in File Systems[edit]
- G. Ganger, M. McKusick, C. Soules, Y. Patt
- ACM Transactions on Computer Systems 18, 2. pp 127–153, May 2000
- Online version[9]
Description: A new way of maintaining filesystem consistency.
Programming languages[edit]
The FORTRAN Automatic Coding System[edit]
- John Backus et al.[10]
- Proceedings of the WJCC (Western Joint Computer Conference), Los Angeles, California, February 1957.
- Online version(PDF)
Description: This paper describes the design and implementation of the first FORTRAN compiler by the IBM team. Fortran is a general-purpose, procedural, imperative programming language that is especially suited to numeric computation and scientific computing.
Recursive functions of symbolic expressions and their computation by machine, part I[11][edit]
- John McCarthy.
- Communications of the ACM, 3(4):184–195, April 1960.
- Several online versions
Description: This paper introduced LISP, the first functional programming language, which was used heavily in many areas of computer science, especially in AI. LISP also has powerful features for manipulating LISP programs within the language.
ALGOL 60[edit]
- Revised Report on the Algorithmic Language Algol 60 by Peter Naur, et al. – The very influential ALGOL definition; with the first formally defined syntax.
- Brian Randell and L. J. Russell, ALGOL 60 Implementation: The Translation and Use of ALGOL 60 Programs on a Computer. Academic Press, 1964. The design of the Whetstone Compiler. One of the early published descriptions of implementing a compiler. See the related papers: Whetstone Algol Revisited Archived 2008-02-27 at the Wayback Machine, and The Whetstone KDF9 Algol Translator by Brian Randell
- Edsger W. Dijkstra, Algol 60 translation: an Algol 60 translator for the x1 and making a translator for Algol 60, report MR 35/61. Mathematisch Centrum, Amsterdam, 1961.[12]
Description: Algol 60 introduced block structure.
The next 700 programming languages[11][edit]
- Peter Landin
- Communications of the ACM 9(3):157–65, March 1966[13]
Description: This seminal paper proposed an ideal language, ISWIM, which, despite never being implemented, influenced the whole later development of programming languages.
Fundamental Concepts in Programming Languages[edit]
- Christopher Strachey
- Lecture notes, International Summer School in Computer Programming, Copenhagen, 1967; published in Higher-Order and Symbolic Computation, 13:11–49, 2000
Description: Fundamental Concepts in Programming Languages introduced much programming language terminology still in use today, including R-values, L-values, parametric polymorphism, and ad hoc polymorphism.
Lambda Papers[edit]
- Gerald Jay Sussman and Guy L. Steele, Jr.
- AI Memos, 1975–1980
- Links to pdf's
Description: This series of papers and reports first defined the influential Scheme programming language and questioned the prevailing practices in programming language design, employing lambda calculus extensively to model programming language concepts and guide efficient implementation without sacrificing expressive power.
Structure and Interpretation of Computer Programs[edit]
- Harold Abelson and Gerald Jay Sussman
- MIT Press, 1984, 1996
Description: This textbook explains core computer programming concepts, and is widely considered a classic text in computer science.
Comprehending Monads[edit]
- Philip Wadler
- Mathematical structures in computer science 2.04 (1992): 461–493.
- Online copy
Description: This paper introduced monads to functional programming.
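A minimal sketch of the idea in Python rather than the paper's Haskell: a Maybe-style monad threads possible failure through a computation with `unit` and `bind`, so error handling disappears from the calling code.

```python
class Maybe:
    """A value that may be absent; bind short-circuits once a step fails."""
    def __init__(self, value, ok=True):
        self.value, self.ok = value, ok

    @staticmethod
    def unit(value):                # the monad's "return"
        return Maybe(value)

    def bind(self, f):              # the monad's ">>="
        return f(self.value) if self.ok else self

    def __repr__(self):
        return f"Just({self.value!r})" if self.ok else "Nothing"

Nothing = Maybe(None, ok=False)

def safe_div(x, y):
    return Maybe.unit(x / y) if y != 0 else Nothing

# The first chain succeeds; the second hits a division by zero and propagates
# Nothing through the rest of the pipeline without raising an exception.
print(Maybe.unit(12).bind(lambda a: safe_div(a, 3)).bind(lambda b: safe_div(b, 2)))
print(Maybe.unit(12).bind(lambda a: safe_div(a, 0)).bind(lambda b: safe_div(b, 2)))
```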
Towards a Theory of Type Structure[edit]
- John Reynolds
- Programming Symposium. Springer Berlin Heidelberg, 1974.
- online copy
Description: This paper introduced System F and created the modern notion of parametric polymorphism.
An axiomatic basis for computer programming[edit]
- Tony Hoare
- Communications of the ACM, Volume 12 Issue 10, Oct. 1969, Pages 576-580
Description: This paper introduced Hoare logic, which forms the foundation of program verification.
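A Hoare triple $\{P\}\ C\ \{Q\}$ asserts that if precondition $P$ holds and command $C$ terminates, then postcondition $Q$ holds afterwards. Two representative rules, in modern notation, are the assignment axiom and the rule of composition:

$$\{Q[e/x]\}\ x := e\ \{Q\} \qquad \frac{\{P\}\ C_1\ \{R\} \quad \{R\}\ C_2\ \{Q\}}{\{P\}\ C_1;\, C_2\ \{Q\}}$$

For example, $\{x = n\}\ x := x + 1\ \{x = n + 1\}$ follows directly from the assignment axiom.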
Scientific computing[edit]
- Wilkinson, J. H.; Reinsch, C. (1971). Linear algebra, volume II of Handbook for Automatic Computation. Springer. ISBN 978-0-387-05414-8.
- Golub, Gene H.; van Loan, Charles F. (1996) [1983], Matrix Computations, 3rd edition, Johns Hopkins University Press, ISBN 978-0-8018-5414-9
Computational linguistics[edit]
- Booth, T. L. (1969). "Probabilistic representation of formal languages". IEEE Conference Record of the 1969 Tenth Annual Symposium on Switching and Automata Theory. pp. 74–81.
- Contains the first presentation of stochastic context-free grammars.
- Koskenniemi, Kimmo (1983), Two-level morphology: A general computational model of word-form recognition and production (PDF), Department of General Linguistics, University of Helsinki, archived from the original (PDF) on 2018-12-21, retrieved 2010-01-10
- The first published description of computational morphology using finite state transducers. (Kaplan and Kay had previously done work in this field and presented it at a conference; the linguist Johnson had noted the possibility in 1972, but had not produced any implementation.)
- Rabiner, Lawrence R. (1989). "A tutorial on hidden Markov models and selected applications in speech recognition". Proceedings of the IEEE. 77 (2): 257–286. CiteSeerX 10.1.1.381.3454. doi:10.1109/5.18626.
- An overview of hidden Markov models geared toward speech recognition and other NLP fields, describing the Viterbi and forward-backward algorithms.
- Brill, Eric (1995). "Transformation-based error-driven learning and natural language processing: A case study in part-of-speech tagging". Computational Linguistics. 21 (4): 543–566.
- Describes a now commonly used POS tagger based on transformation-based learning.
- Manning, Christopher D.; Schütze, Hinrich (1999), Foundation of Statistical Natural Language Processing, MIT Press
- Textbook on statistical and probabilistic methods in NLP.
- Frost, Richard A. (2006). "Realization of Natural-Language Interfaces Using Lazy Functional Programming" (PDF). ACM Computing Surveys. 38 (4): 11–es. CiteSeerX 10.1.1.114.4151. doi:10.1145/1177352.1177353.
- This survey documents the relatively under-researched role of lazy evaluation in functional programming languages (e.g., Haskell) in natural language processing and in accommodating many linguistic theories.
Software engineering[edit]
Software engineering: Report of a conference sponsored by the NATO Science Committee[edit]
- Peter Naur, Brian Randell (eds.)
- Garmisch, Germany, 7–11 October 1968, Brussels, Scientific Affairs Division, NATO (1969) 231pp.
- Online copy (PDF)
Description: The report of the 1968 NATO conference of leading figures in the software field; it defined the field of software engineering.
A Description of the Model-View-Controller User Interface Paradigm in the Smalltalk-80 System[14][edit]
- Krasner, Glenn E.; Pope, Stephen T.
- The Journal of Object Technology, Aug-Sep 1988
- Online copy (PDF) Archived 2016-03-14 at the Wayback Machine
Description: A description of the system that originated the (now dominant) GUI programming paradigm of Model–view–controller
Go To Statement Considered Harmful[11][edit]
- Dijkstra, E. W.
- Communications of the ACM, 11(3):147–148, March 1968
- Online copy Archived 2018-02-10 at the Wayback Machine
Description: Don't use goto – the beginning of structured programming.
On the criteria to be used in decomposing systems into modules[edit]
- David Parnas
- Communications of the ACM, Volume 15, Issue 12:1053–1058, December 1972.
- Online copy (PDF)
Description: The importance of modularization and information hiding. Note that information hiding was first presented in a different paper of the same author – "Information Distributions Aspects of Design Methodology", Proceedings of IFIP Congress '71, 1971, Booklet TA-3, pp. 26–30
Hierarchical Program Structures[edit]
- Ole-Johan Dahl, C. A. R. Hoare
- in Dahl, Dijkstra and Hoare, Structured Programming, Academic Press, London and New York, pp. 175–220, 1972.
Description: The beginning of object-oriented programming. This paper argued that programs should be decomposed into independent components with small and simple interfaces, and that objects should have both data and related methods.
A Behavioral Notion of Subtyping[edit]
- Barbara H. Liskov, Jeannette M. Wing
- ACM Transactions on Programming Languages and Systems (TOPLAS), 1994
Description: Introduces the Liskov substitution principle and establishes behavioral subtyping rules.
A technique for software module specification with examples[edit]
- David Parnas
- Comm. ACM 15, 5 (May 1972), 330–336.
- Online copy (PDF)
Description: An early technique for the formal specification of software modules.
Structured Design[edit]
- Wayne Stevens, Glenford Myers, and Larry Constantine
- IBM Systems Journal, 13 (2), 115–139, 1974.
- On-line copy (PDF)
Description: Seminal paper on Structured Design, data flow diagram, coupling, and cohesion.
The Emperor's Old Clothes[edit]
- C.A.R. Hoare
- Communications of the ACM, Vol. 24, No. 2, February 1981, pp. 75–83.
- Archived copy (PDF)
Description: Illustrates the "second-system effect" and the importance of simplicity.
The Mythical Man-Month: Essays on Software Engineering[edit]
- Brooks, Jr., F. P.
- Addison Wesley Professional. 2nd edition, 1995.
Description: Throwing more people at the task will not speed its completion...
No Silver Bullet: Essence and Accidents of Software Engineering[edit]
- Fred Brooks
- IEEE Computer, 20 (4): 10–19, April 1987. CiteSeerX 10.1.1.117.315. doi:10.1109/MC.1987.1663532.
Description: Brooks argues that "there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity." He also states that "we cannot expect ever to see two-fold gains every two years" in software development, as there is in hardware development (Moore's law).
The Cathedral and the Bazaar[edit]
- Raymond, E.S.
- First Monday, 3, 3 (March 1998)
- Online copy (HTML)
Description: Open source methodology.
Design Patterns: Elements of Reusable Object Oriented Software[edit]
- E. Gamma, R. Helm, R. Johnson, J. Vlissides
- Addison–Wesley, Reading, Massachusetts, 1995.
Description: This book was the first to define and list design patterns in computer science.
Statecharts: A Visual Formalism For Complex Systems[edit]
- David Harel
- Science of Computer Programming, 8:231–274, 1987
- Online version
Description: Statecharts are a visual modeling method. They extend state machines and can be exponentially more compact, and therefore enable formal modeling of applications that were previously too complex to model. Statecharts are part of the UML family of diagrams.
Security and privacy[edit]
Anonymity and Privacy[edit]
- David Chaum. Untraceable electronic mail, return addresses, and digital pseudonyms. Communications of the ACM, 24(2):84–88, February 1981.
- Cynthia Dwork, Frank McSherry, Kobbi Nissim, Adam Smith. Calibrating Noise to Sensitivity in Private Data Analysis, In Theory of Cryptography Conference (TCC), Springer, 2006. doi:10.1007/11681878_14. The full version appears in Journal of Privacy and Confidentiality, 7 (3), 17–51. doi:10.29012/jpc.v7i3.405
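A minimal sketch of the Laplace mechanism from the Dwork–McSherry–Nissim–Smith paper: the true answer of a query is perturbed with noise whose scale is the query's sensitivity divided by the privacy parameter epsilon (the numbers below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Return an epsilon-differentially private answer by adding
    Laplace(sensitivity / epsilon) noise to the true answer."""
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# A counting query ("how many records satisfy P?") has sensitivity 1, because
# adding or removing one individual's record changes the count by at most 1.
true_count = 1234
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
```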
Cryptography[edit]
- Whitfield Diffie and Martin E. Hellman, New Directions in Cryptography, IEEE Transactions on Information Theory, November 1976 (a toy sketch of the key exchange appears after this list)
- R. L. Rivest, A. Shamir and L. M. Adleman, A Method for Obtaining Digital Signatures and Public-Key Cryptosystems, MIT/LCS/TM-82, 1977
- Merkle, R. Security, Authentication, and Public Key Systems Archived 2018-08-14 at the Wayback Machine, PhD thesis, Stanford University, 1979
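A toy sketch of the Diffie–Hellman key exchange from the first paper above, with deliberately tiny parameters (real deployments use groups of 2048 bits or more, or elliptic curves).

```python
import secrets

# Public parameters: a prime modulus p and a generator g (toy-sized here).
p, g = 23, 5

# Each party picks a private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)          # Alice's public value
B = pow(g, b, p)          # Bob's public value

# Both sides derive the same shared secret without ever transmitting it.
assert pow(B, a, p) == pow(A, b, p)
print("shared secret:", pow(B, a, p))
```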
Passwords[edit]
- Morris, Robert and Thompson, Ken. Password security: a case history, Communications of the ACM, Volume 22, Issue 11, November 1979, Pages 594–597. PDF Archived 2016-03-05 at the Wayback Machine
System security[edit]
- Dennis and Van Horn, Programming Semantics for Multiprogrammed Computations, ACM Conference on Programming Languages and Pragmatics (August 1965)
- Saltzer and Schroeder, The Protection of Information in Computer Systems Archived 2016-03-23 at the Wayback Machine, ACM Symposium on Operating System Principles (October 1973) HTML HTML2
- Karger and Schell, Thirty Years later: Lessons from the Multics Security Evaluation, ACSAC 2002
- Lampson, Butler. A Note on the Confinement Problem, Communications of the ACM, 16:10 (Oct. 1973), pp. 613–615. PDF
- Thompson, Ken. Reflections on Trusting Trust, Communications of the ACM, 27:8, Aug 1984
- J.E. Forrester and B.P. Miller, An Empirical Study of the Robustness of Windows NT Applications Using Random Testing, 4th USENIX Windows Systems Symposium, Seattle, August 2000.
Usable security[edit]
- Whitten, Alma and Tygar, J.D., Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0, Proceedings of the 8th conference on USENIX Security Symposium, Volume 8, August 1999, Pages 14–28
- Garfinkel, Simson and Shelat, Abhi, Remembrance of Data Passed, IEEE Security and Privacy, Volume 1, Issue 1, January 2003, Pages 17–27
Theoretical computer science[edit]
Topics covered: theoretical computer science, including computability theory, computational complexity theory, algorithms, algorithmic information theory, information theory and formal verification.
See also[edit]
- DBLP (Digital Bibliography & Library Project in computer science)
- List of open problems in computer science
- List of computer science journals
- List of computer science conferences
- The Collection of Computer Science Bibliographies
- Paris Kanellakis Award, a prize given to honor specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing.
References[edit]
- ↑ Naaman, Michael (2021). "On the tight constant in the multivariate Dvoretzky-Kiefer-Wolfowitz inequality". Statistics and Probability Letters. 173: 109088. doi:10.1016/j.spl.2021.109088.
- ↑ Linnainmaa, Seppo (1970). The representation of the cumulative rounding error of an algorithm as a Taylor expansion of the local rounding errors. Master's thesis, University of Helsinki, pp. 6–7.
- ↑ Griewank, Andreas (2012). Who Invented the Reverse Mode of Differentiation? Optimization Stories, Documenta Mathematica, Extra Volume ISMP (2012), 389–400.
- ↑ Werbos, P. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD thesis, Harvard University, 1974.
- ↑ Laplante 1996, p. 150
- ↑ Laplante 1996, p. 208
- ↑ The rendering equation
- ↑ Kass, M.; Witkin, A.; Terzopoulos, D. (1988). "Snakes: Active contour models" (PDF). International Journal of Computer Vision. 1 (4): 321. CiteSeerX 10.1.1.124.5318. doi:10.1007/BF00133570. Archived from the original (PDF) on 2016-01-12. Retrieved 2015-08-28.
- ↑ Behrouz Forouzan. "Data communication and networking book". McGraw Hill Education. Archived from the original on 4 September 2014. Retrieved 1 Jan 2013.
- ↑ Laplante 1996, p. 62
- ↑ Pierce, Benjamin C. (2004). "Great works in programming languages". Penn Engineering.
- ↑ "Archived copy" (PDF). Archived from the original (PDF) on 2007-02-04. Retrieved 2007-02-26. Unknown parameter
|url-status=
ignored (help)CS1 maint: Archived copy as title (link) - ↑ "Google Академія". Archived from the original on 2015-05-14. Retrieved 2016-11-14. Unknown parameter
|url-status=
ignored (help) - ↑ Model View Controller History Archived 2011-05-15 at the Wayback Machine. C2.com (2012-05-11). Retrieved on 2013-12-09.
Works cited[edit]
- Laplante, Phillip, ed. (1996). Great papers in computer science. New York: IEEE Press. ISBN 978-0-314-06365-6.
- Randell, Brian, ed. (1982). The Origins of Digital Computers: Selected Papers. 3rd ed. Berlin: Springer-Verlag. ISBN 0-387-11319-3.
- Turning Points in Computing: 1962–1999, Special Issue, IBM Systems Journal, 38 (2/3), 1999.
- Yourdon, Edward, ed. (1979). Classics in Software Engineering. New York: Yourdon Press. ISBN 0-917072-14-6.