Honorary Day 2009 for Robert L. Constable                   
Ben-Gurion University
Honorary Day 2009
Abstracts and Bios

Creating Knowledge in the Age of Digital Information
Robert L. Constable, Cornell University
Abstract:
My perspective on computer science changed when I served as CS department chair, and changed again as CIS dean.
I will discuss these changes and illustrate them with examples of discoveries at the intersection of computing with other disciplines that enrich the core of computer science, concluding with a viewpoint not yet widely accepted -- but related to the title of the talk.
Bio:
Robert Constable is the founding dean of the Faculty of Computing and Information Science
(CIS) at Cornell University, where he has been a professor since 1968. He served six years as chair of
the Computer Science Department, which is now one of the departments in CIS along with Information Science
and Statistics. CIS also includes the Program in Computer Graphics.
Professor Constable works in the area of formal methods and applied logic
and is known as one of the founders of computational type theory and for leading
the effort to build one of the first theorem provers (Nuprl) based on this theory.
The theory implements the well-known "proofs as programs" principle, which he has recently
extended to the "proofs as processes" principle applicable in distributed computing.
Proving Church's Thesis
Nachum Dershowitz, Tel-Aviv University
Abstract:
Church’s Thesis asserts that the only numeric functions that can be calculated by effective means are the recursive ones, which are the same (extensionally) as the Turing-computable numeric functions. Yuri Gurevich's Abstract State Machine Theorem states that every classical algorithm is emulated (step for step) by an abstract state machine, which is a most generic model of sequential computation.
That theorem presupposes three natural postulates about algorithmic computation. By augmenting those postulates with an additional requirement regarding basic operations, one obtains a natural axiomatization of computability and a proof of Church’s Thesis, as Gödel and others suggested might be possible.
(Joint work with Yuri Gurevich.)
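To make the abstract state machine notion concrete, here is a small illustrative sketch (my own, not taken from the talk): an ASM state is a finite map of locations to values, and each step fires a guarded, simultaneous update, emulating a classical algorithm such as Euclid's gcd step for step.

```python
def asm_step(state):
    """One ASM step for Euclid's algorithm: a guarded simultaneous assignment."""
    a, b = state["a"], state["b"]
    if b != 0:                        # guard
        return {"a": b, "b": a % b}   # both updates happen simultaneously
    return state                      # no rule fires: the machine has halted

def run(state):
    """Iterate asm_step until the state is a fixpoint (the machine halts)."""
    while True:
        nxt = asm_step(state)
        if nxt == state:
            return state
        state = nxt

print(run({"a": 54, "b": 24})["a"])   # gcd(54, 24) = 6
```

The point of the ASM theorem is that every classical algorithm, not just this one, is emulated step for step by a machine of exactly this shape.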
Bio:
Nachum Dershowitz is a professor of computer science at Tel Aviv University.
His research interests include rewriting theory, termination orderings, abstract state
machines, natural language processing, Boolean satisfiability, and calendrical algorithms.
Modular Approach for Developing Robust Protocols
Danny Dolev, Hebrew University of Jerusalem
Abstract:
A decade ago, Robert L. Constable and his team proved the correctness of Ensemble, a multi-layer group communication system, using the Nuprl formal system.
In the talk we will present a protocol that synchronizes nodes in an expected constant number of rounds, despite
Byzantine and transient faults. The protocol is composed of a stack of modules developed in several fields of
computer science over the last couple of decades. Formally proving the correctness of the construction, or of some of its
components, will be the next challenge for Constable's team.
Bio:
TBA
Synthesis of Programs from Temporal Property Specifications
Amir Pnueli, Weizmann Institute of Science & NYU
Abstract:
The current state of the art uses specification of properties in a behavioral
form, such as linear temporal logic, mainly for static and dynamic verification of
implementations of reactive systems, AFTER they have been constructed. In the talk
we consider the more radical approach by which a specification in terms of its
desired properties is formed at the beginning of a system development, before any
implementation is attempted. After some analysis and validation of this specification,
one may apply a synthesis algorithm in order to automatically derive a correct-by-construction
implementation.
No further verification or testing is required. While this has been considered an
unachievable dream for a long time, recent progress in the algorithms for synthesis
has made design synthesis from property specifications applicable to medium-size
designs. We provide more details about the problem of synthesizing digital designs
from their LTL (and similar) specifications. In spite of the theoretical double
exponential lower bound, we show that for many expressive specifications of
hardware designs the problem can be solved in time N^3. We describe the context of
the problem, as part of the Prosyd European Project which aims to provide a
property-based development flow for hardware designs. Within this project, synthesis
plays an important role, first in order to check whether a given specification is
realizable, and then for synthesizing part of the developed system.
The above results were obtained for the synthesis of synchronous systems. In the talk,
we will consider the generalization of the problem to the case of asynchronous
systems. This will enable the synthesis of programs that communicate with their
environments by shared variables. We will mainly report about work in progress,
which provides different approaches and heuristics for establishing unrealizability
and realizability of asynchronous specifications. This is joint work with Nir
Piterman, Yaniv Sa'ar, and Uri Klein.
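As a small illustration of the game-theoretic view underlying synthesis (my own simplified sketch, not the Prosyd algorithm), realizability of a safety specification reduces to solving a two-player safety game on a finite graph: the system wins from exactly those states where it can keep the play out of the unsafe states forever, computable as a least fixpoint.

```python
def solve_safety(states, edges, system_states, unsafe):
    """Return the states from which the system can stay safe forever.

    states: iterable of game states; edges: dict state -> list of successors;
    system_states: states where the system picks the next move (the
    environment picks elsewhere); unsafe: states violating the specification.
    """
    losing = set(unsafe)
    changed = True
    while changed:                   # least fixpoint: environment's attractor
        changed = False
        for s in states:
            if s in losing:
                continue
            succ = edges[s]
            if s in system_states:
                # system loses if every available move leads to a losing state
                if all(t in losing for t in succ):
                    losing.add(s)
                    changed = True
            else:
                # environment loses nothing: it wins if SOME move is losing
                if any(t in losing for t in succ):
                    losing.add(s)
                    changed = True
    return set(states) - losing
```

For example, with states {0, 1, 2}, system moves at 0, edges 0 -> {1, 2}, 1 -> {1}, 2 -> {2}, and unsafe = {2}, the system wins from {0, 1}: at state 0 it simply chooses the move to 1. The N^3 bound mentioned above comes from richer (GR(1)-style) winning conditions, solved by nested fixpoints of this kind.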
Bio:
Amir Pnueli has been a Silver Professor of Computer Science at the Courant Institute
of Mathematical Sciences at New York University since 1999. Prof. Pnueli is the
1996 recipient of the ACM Turing Award and the 2000
recipient of the Israel Prize in the category of Exact Sciences.
In 2008, Pnueli received the 2007 Software System Award, as a member of
the team that developed the software engineering tool Statemate.
Prof. Pnueli is mainly known for the introduction of temporal logic
into Computer Science; his work on the application of temporal logic
and similar formalisms for the specification and verification of
reactive systems; the identification of the class of "Reactive
Systems" as systems whose formal specification, analysis, and
verification require a distinctive approach; and the development of a
rich and detailed methodology, based on temporal logic, for the formal
treatment of reactive systems; extending this methodology into the
realm of real-time systems; and more recently, introducing into formal
analysis the models of hybrid systems with appropriate extension of
the temporal-logic based methodology.
Besides his more theoretical work, concerning a complete axiom system and
proof theory for program verification by temporal logic, he also
contributed to algorithmic research in this area. He developed a
deductive system for linear-time temporal logic and model-checking
algorithms for the verification of temporal properties of finite-state
systems. Together with David Harel, Pnueli worked on the semantics and
implementation of Statecharts, a visual language for the specification,
modeling, and prototyping of reactive systems. This language has been
applied to avionics, transport, and electronic hardware systems. His
current research interests involve synthesis of reactive modules,
automatic verification of multi-process systems, and specification
methods that combine transition systems with temporal logic.
Lexical Cohesion in Texts - Extraction Methods and Applications
Eli Shamir, Hebrew University of Jerusalem
Abstract:
What makes texts cohesive (beyond being random word-sets), fit to communicate ideas, emotions, descriptions, and instructions? Grammatical
cohesion bonds are easy to explicate (agreement rules, pronouns, conjunctions, syntax...). Lexical cohesion is hard: it draws upon semantics and world knowledge in
millions of minds. Alternatively, civilization has created repositories -- dictionaries, data corpora, etc. -- and modern technologies to process them quickly. We designed an annotation task
(run with 22 readers on 10 texts of 600-1200 words each) asking readers to give each new text word an anchor link to a previous word, if the pair is semantically related under general
common knowledge. The results of this experiment were analyzed in a statistically novel way to extract a core of 1261 lexically cohesive pairs on which there is very high agreement.
This sizable text-based core proved useful, as a teacher's "gold standard", for designing automatic extraction of lexically cohesive pairs from texts, based on relevant features drawn
from the text and the repositories together. Supervised learning is invoked to build decision trees on these features, calibrated against the experimental core. Other correlations
and applications include studies (BBK et al.) of political texts and speeches of prominent leaders, to extract dominant themes, beliefs, creeds, and even typical metaphoric styles.
Bio:
TBA
Cluster Based Computation of Relational Joins
Jeffrey D. Ullman, Stanford University
Abstract:
The prevalence of large racks of interconnected processor nodes forces us to take another
look at how to exploit parallelism when taking the join of large relations. Sometimes, there is a gain in
total cost to be had by distributing pieces of each relation to several different nodes and computing the join of several large
relations at once. The optimization problem is to pick the degree of replication of each relation, under the constraint that the
total number of compute-nodes is fixed. We set up this problem as a nonlinear optimization and show that there is always a solution
(which must be approximated by rounding to the nearest integers). For some of the most common types of join -- star joins and chain joins --
we give closed-form solutions to the optimization problem. Finally, we point out that the join algorithm we propose can be implemented using
features already present in Hadoop, the open-source implementation of map-reduce.
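The replication scheme the abstract describes can be sketched as follows (a simplified illustration of the idea, not the paper's implementation): to join R(A,B), S(B,C), T(C,D) in one round on a b x c grid of reducers, each R-tuple is replicated to c reducers, each T-tuple to b reducers, and each S-tuple goes to exactly one; choosing the shares b and c, subject to bc equaling the number of nodes, is the nonlinear optimization mentioned above.

```python
from itertools import product

def map_phase(R, S, T, b, c):
    """Distribute tuples of R(A,B), S(B,C), T(C,D) to a b x c reducer grid."""
    reducers = {k: {"R": [], "S": [], "T": []}
                for k in product(range(b), range(c))}
    for (a_, b_) in R:                 # replicate R across the C dimension
        for j in range(c):
            reducers[(hash(b_) % b, j)]["R"].append((a_, b_))
    for (b_, c_) in S:                 # S is sent to exactly one reducer
        reducers[(hash(b_) % b, hash(c_) % c)]["S"].append((b_, c_))
    for (c_, d_) in T:                 # replicate T across the B dimension
        for i in range(b):
            reducers[(i, hash(c_) % c)]["T"].append((c_, d_))
    return reducers

def reduce_phase(reducers):
    """Each reducer joins its local fragments; the union is the full join."""
    out = set()
    for parts in reducers.values():
        for (a_, b1) in parts["R"]:
            for (b2, c1) in parts["S"]:
                if b1 != b2:
                    continue
                for (c2, d_) in parts["T"]:
                    if c1 == c2:
                        out.add((a_, b1, c1, d_))
    return out
```

Total communication is c|R| + |S| + b|T| for bc fixed, which is the quantity the closed-form solutions for chain and star joins minimize.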
Bio:
Jeffrey D. Ullman is currently the Stanford W. Ascherman Professor of Computer Science (Emeritus) at Stanford University,
as well as CEO of the Gradiance Corporation. He received a Bachelor of Science degree in Engineering Mathematics from Columbia University
and his Ph.D. in Electrical Engineering from Princeton University. Previously he worked at Bell Labs for several years and then as a professor at Princeton University.
Since 1979 he has been a professor at Stanford University. In 1995 he was inducted as a Fellow of the Association for Computing Machinery and in 2000 he was awarded
the Knuth Prize.
His research interests include database theory, data integration, data mining, and education using the information infrastructure. His textbooks on compilers, data
structures, theory of computation, and databases are regarded as standards in their fields. He is one of the founders of the field of database theory, and was the doctoral
advisor of an entire generation of students who later became leading database theorists in their own right. He was the Ph.D. advisor of Sergey Brin who was one of the
co-founders of Google. Professor Ullman also served on Google's technical advisory board.
Constraints, Graphs, Algebra, Logic, and Complexity
Moshe Vardi, Rice University
Abstract:
A large class of problems in AI and other areas of computer science
can be viewed as constraint-satisfaction problems.
This includes problems in database query optimization, machine vision, belief maintenance, scheduling, temporal reasoning, type reconstruction,
graph theory, and satisfiability. All of these problems can be recast as questions regarding the existence of homomorphisms between two directed graphs. It is well known that the constraint-satisfaction problem is NP-complete. This has motivated an extensive research program to identify tractable cases of constraint satisfaction.
This research proceeds along two major lines. The first line of research
focuses on non-uniform constraint satisfaction, where the target graph is
fixed. The goal is to identify those target graphs that give rise to a
tractable constraint-satisfaction problem. The second line of research focuses on identifying large classes of source graphs for which
constraint-satisfaction is tractable. We show in this talk how tools
from graph theory, universal algebra, logic, and complexity theory, shed light on the tractability of constraint satisfaction.
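The homomorphism view can be made concrete with a small brute-force sketch (my own illustration, not from the talk): a CSP instance is solvable exactly when there is a homomorphism from the source (instance) digraph to the target (template) digraph, and the non-uniform question asks which fixed targets admit a polynomial-time algorithm instead of this exponential search.

```python
from itertools import product

def homomorphism(source_nodes, source_edges, target_nodes, target_edges):
    """Return a homomorphism h (as a dict) from source to target, or None.

    h is a homomorphism when (h[u], h[v]) is a target edge for every
    source edge (u, v). Brute force: try every assignment of target
    nodes to source nodes.
    """
    tset = set(target_edges)
    for values in product(target_nodes, repeat=len(source_nodes)):
        h = dict(zip(source_nodes, values))
        if all((h[u], h[v]) in tset for (u, v) in source_edges):
            return h
    return None

# 2-colorability is homomorphism into K2 (both directed edges present):
# an even cycle maps into K2, an odd cycle does not.
k2_nodes, k2_edges = [0, 1], [(0, 1), (1, 0)]
print(homomorphism([0, 1, 2, 3], [(0, 1), (1, 2), (2, 3), (3, 0)],
                   k2_nodes, k2_edges) is not None)   # True: 4-cycle
print(homomorphism([0, 1, 2], [(0, 1), (1, 2), (2, 0)],
                   k2_nodes, k2_edges) is not None)   # False: triangle
```

With the target fixed to K2 this is 2-coloring (tractable); with K3 it is 3-coloring (NP-complete), which is the dichotomy phenomenon the non-uniform line of research studies.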
Bio:
Moshe Y. Vardi is the George Professor in Computational Engineering and Director of the Ken Kennedy Institute for Information Technology at Rice University. He chaired the Computer Science Department at Rice University from January 1994 till June 2002. Prior to joining Rice in 1993, he was at the IBM Almaden Research Center, where he managed the Mathematics and Related Computer Science Department. His research interests include database systems, computational-complexity theory, multi-agent systems, and design specification and verification. Vardi received his Ph.D. from the Hebrew University of Jerusalem in 1981. He is the author and co-author of over 350 articles, as well as two books, "Reasoning about Knowledge" and "Finite Model Theory and Its Applications", and the editor of several collections. He is currently the Editor-in-Chief of the Communications of the ACM.
Vardi is the recipient of three IBM Outstanding Innovation Awards, a co-winner of the 2000 Gödel Prize, a co-winner of the 2005 ACM Kanellakis Award for Theory and Practice, a co-winner of the 2006 LICS Test-of-Time Award, a co-winner of the 2008 ACM PODS Mendelzon Test-of-Time Award, a winner of the 2008 ACM SIGMOD Codd Innovations Award, a recipient of the 2008 Blaise Pascal Medal for Computer Science from the European Academy of Sciences, as well as a 2008 ACM Presidential Award. He holds honorary doctorates from the University of Saarland, Germany, and the University of Orleans, France. Vardi is an editor of several international journals. He is a Guggenheim Fellow, as well as a Fellow of the Association for Computing Machinery, the American Association for the Advancement of Science, the Association for the Advancement of Artificial Intelligence, and the Institute of Electrical and Electronics Engineers. He was designated a Highly Cited Researcher by the Institute for Scientific Information, and was elected a member of the US National Academy of Engineering, the European Academy of Sciences, and the Academia Europaea.