Department of Informatics

Details Colloquium Fall 2017

21.09.2017 - Edgeware, the Open Specifications Model, and the Internet Backpack

Speaker: Prof. Lee McKnight, Ph.D.

Host: Prof. Dr. Burkhard Stiller

Abstract

Human and cyber-physical systems are always susceptible to a variety of threats; at a minimum, we might prefer systems that will not do foolish things. An Open Specifications Model has been developed around a two-part cyber-physical kernel by researchers and students from many nations, in part through several National Science Foundation Partnerships for Innovation projects. The work began in 2002 as exploratory research on what we now call the Internet of Things, cloud services, and edgeware, and will soon call 5G+ advanced wireless networks. Nine doctoral theses have been authored, and thousands of academic and industry researchers, school teachers and schoolchildren, as well as enterprise CIOs and government and civil society members, have contributed to date in various ways, including through academic and professional conferences, publications and journal articles, and standards, reference architecture, and framework specification organizations.

The Open Specifications Model v0.5, now in development, comprises many technical mechanisms as well as legal, policy, and economic elements and features; it is far from complete and must continually evolve, just as blockchain and the Internet of Things were added to v0.4. This talk explores the virtual market questions that originally stimulated the research, reviews insights from the 'cloud to edge' focus of the current phase, and suggests directions for future edgeware research and development. Innovation Zones in the Democratic Republic of the Congo and testbeds in the United States that permit evaluation of edgeware and the Internet Backpack for emergency and education use will be described. Opportunities for further contributions to the model, to IETF and other standards, and to future research will be identified.

Bio

Prof. Lee W. McKnight, Ph.D., is an Associate Professor in the iSchool (The School of Information Studies) at Syracuse University, an Affiliate of Syracuse University's Institute for National Security and Counterterrorism (INSCT), and has lectured annually at MIT since 1998. Lee was Principal Investigator of the National Science Foundation Partnerships for Innovation Wireless Grids Innovation Testbed (WiGiT) project from 2009 to 2014, which received the 2011 TACNY Award for Technology Project of the Year. Lee is the inventor of edgeware, a new class of software for creating secure ad hoc overlay cloud-to-edge applications. Lee's research focuses on cloud management of dynamic edge services, virtual markets and wireless grids, and Internet governance. McKnight teaches graduate and undergraduate courses such as Blockchain Management, Cloud Architecture, Cloud Management, Information Security Policy (joint with Syracuse Law School/INSCT), and Information Policy at Syracuse University, and lectures annually in the MIT Professional Education short course 'Technology, Organizations, and Innovation: Putting Ideas to Work.' In addition to many peer-reviewed journal articles in technical and policy journals, his academic work includes several path-breaking books. Lee received a Ph.D. from MIT in 1989, an M.A. from the School of Advanced International Studies, Johns Hopkins University, in 1981, and a B.A. magna cum laude from Tufts University in 1978.

26.10.2017 - Mining Large Cultural Heritage Corpora Using Deep Learning Methods

Speaker: Prof. Frédéric Kaplan, Ph.D.

Host: Prof. Dr. Martin Volk

Abstract

I will report on our ongoing investigations into three large-scale cultural heritage datasets: 4 million Swiss newspaper articles covering a two-hundred-year period, 1 million photographs of artworks currently being digitised at the Cini Foundation, and the continuously expanding corpora of the Venice Time Machine, covering documents from a 1,000-year period. The Swiss newspaper archive is sufficiently large to test word embedding methods like Word2Vec and to study how they perform in diachronic contexts, in which words progressively change meaning as language itself evolves. On the artwork databases, we use convolutional neural networks to find similarities between paintings, engravings, drawings, and sculptures, and we design architectures for efficiently spotting matching details. Finally, we combine these two approaches to try to crack one of the hardest problems of the Venice Time Machine: the direct projection of graphical forms into semantic spaces without passing through the currently impractical full textual transcription of the digitised documents.
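
To make the diachronic embedding idea concrete, the following is a minimal sketch, not the DHLAB pipeline: it trains a separate Word2Vec model per time slice with the gensim library and compares a word's nearest neighbours across slices. The corpus paths, the decade slicing, and the probe word are hypothetical placeholders.

```python
# Minimal sketch of diachronic word embeddings: train one Word2Vec model
# per time slice and compare a word's nearest neighbours across decades.
# NOTE: a simplified illustration, not the actual DHLAB pipeline; the
# corpus paths and decade slicing are hypothetical placeholders.
from gensim.models import Word2Vec

def load_tokenized_articles(decade):
    """Placeholder: yield tokenized newspaper articles for one decade."""
    with open(f"corpus/articles_{decade}s.txt", encoding="utf-8") as f:
        for line in f:
            yield line.lower().split()

models = {}
for decade in (1850, 1900, 1950, 2000):
    sentences = list(load_tokenized_articles(decade))
    # vector_size/window/min_count are ordinary defaults, not tuned values.
    models[decade] = Word2Vec(sentences, vector_size=100, window=5, min_count=10)

# A word whose neighbourhood drifts across slices has changed meaning.
for decade, model in models.items():
    print(decade, model.wv.most_similar("wireless", topn=5))
```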

Bio

Prof. Frédéric Kaplan, Ph.D., holds the Digital Humanities Chair at the École Polytechnique Fédérale de Lausanne (EPFL) and directs the EPFL Digital Humanities Laboratory (DHLAB). He conducts research projects combining archive digitisation, information modelling, and museographic design. He is currently directing the "Venice Time Machine", an international project in collaboration with the Ca' Foscari University in Venice and the Venice State Archives, aiming to model the evolution and history of Venice over a 1,000-year period.

09.11.2017 - Incentive Auctions and Spectrum Repacking: A Case Study for "Deep Optimization"

Speaker: Prof. Kevin Leyton-Brown, Ph.D.

Host: Prof. Dr. Sven Seuken

Abstract

Over 13 months in 2016–17, the US Federal Communications Commission conducted an "incentive auction" to repurpose radio spectrum from broadcast television to wireless internet. In the end, the auction yielded $19.8 billion, $10.05 billion of which was paid to 175 broadcasters for voluntarily relinquishing their licenses across 14 UHF channels. Stations that continued broadcasting were assigned potentially new channels to fit as densely as possible into the channels that remained. The government netted more than $7 billion (used to pay down the national debt) after covering costs (including retuning). A crucial element of the auction design was the construction of a solver, dubbed SATFC, that determined whether sets of stations could be "repacked" in this way; it needed to run every time a station was given a price quote.
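
The feasibility question SATFC answers can be pictured as list coloring on an interference graph: each remaining station must receive a channel from its allowed set such that no two interfering stations share a channel. The toy backtracking checker below is for illustration only; the real SATFC encodes the problem as SAT and adds the domain-specific machinery described next. The three-station instance is invented.

```python
# Toy feasibility check for spectrum repacking, phrased as list coloring:
# assign each station a channel from its allowed set so that no pair of
# interfering stations shares a channel. Illustrative only -- the real
# SATFC uses SAT encodings plus the techniques described in this talk.
def repackable(stations, allowed, interferes, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(stations):
        return assignment                      # every station placed
    s = next(st for st in stations if st not in assignment)
    for ch in allowed[s]:
        # channel usable if no already-placed interfering station holds it
        if all(assignment.get(t) != ch for t in interferes[s]):
            result = repackable(stations, allowed, interferes,
                                {**assignment, s: ch})
            if result:
                return result
    return None                                # dead end: backtrack

# Hypothetical three-station instance on two remaining UHF channels.
stations = ["A", "B", "C"]
allowed = {"A": [14, 15], "B": [14], "C": [14, 15]}
interferes = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(repackable(stations, allowed, interferes))  # {'A': 15, 'B': 14, 'C': 15}
```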

This talk describes the process by which we built SATFC and its impact on the auction. We adopted an approach we dub "deep optimization", taking a data-driven, highly parametric, and computationally intensive approach to solver design. More specifically, to build SATFC we designed software that could pair both complete and local-search SAT-encoded feasibility checking with a wide range of domain-specific techniques, such as constraint graph decomposition and novel caching mechanisms that allow for reuse of partial solutions from related, solved problems. We then used automatic algorithm configuration techniques to construct a portfolio of eight complementary algorithms to be run in parallel, aiming to achieve good performance on instances that arose in proprietary auction simulations. We found that within the short time budget required in practice, SATFC solved more than 95% of the problems it encountered. Furthermore, simulation results showed that the incentive auction paired with SATFC produced nearly optimal allocations in a restricted setting and substantially outperformed other alternatives at national scale.
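
The portfolio idea in the last paragraph can be sketched in a few lines: run complementary solvers in parallel under a wall-clock budget and take whichever finishes first. The two solver functions below are stand-ins, not SATFC's actual components, and the budget is arbitrary.

```python
# Minimal sketch of a parallel algorithm portfolio with a time budget:
# launch complementary solvers concurrently and return the first answer.
# The two "solvers" are stand-ins, not SATFC components.
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

def complete_sat_solver(instance):
    ...  # stand-in for a complete SAT-based feasibility check

def local_search_solver(instance):
    ...  # stand-in for an incomplete local-search check

def portfolio_solve(instance, solvers, budget_seconds=60):
    with ThreadPoolExecutor(max_workers=len(solvers)) as pool:
        futures = [pool.submit(s, instance) for s in solvers]
        done, not_done = wait(futures, timeout=budget_seconds,
                              return_when=FIRST_COMPLETED)
        for f in not_done:
            f.cancel()           # best effort; running threads finish anyway
        return next(iter(done)).result() if done else "timeout"
```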

Bio

Prof. Kevin Leyton-Brown, Ph.D., is a professor of Computer Science at the University of British Columbia and an associate member of the Vancouver School of Economics. He holds a PhD and M.Sc. from Stanford University (2003; 2001) and a B.Sc. from McMaster University (1998). He studies the intersection of computer science and microeconomics, addressing computational problems in economic contexts and incentive issues in multiagent systems. He also applies machine learning to the automated design and analysis of algorithms for solving hard computational problems. He has co-written two books, "Multiagent Systems" and "Essentials of Game Theory," and over 100 peer-refereed technical articles. He is the recipient of UBC's 2015 Charles A. McDowell Award for Excellence in Research, a 2014 NSERC E.W.R. Steacie Memorial Fellowship—previously given to a computer scientist only 10 times since its establishment in 1965—and a 2013 Outstanding Young Computer Science Researcher Prize from the Canadian Association of Computer Science. He has co-taught two Coursera courses on "Game Theory" to over half a million students, and is chair of the ACM Special Interest Group on Electronic Commerce.

23.11.2017 - Big Data Management and Apache Flink: Key Challenges and (Some) Solutions

Speaker: Prof. Dr. Volker Markl

Hosts: Prof. Dr. Abraham Bernstein, Prof. Dr. Michael Böhlen

Abstract

The shortage of qualified data scientists is effectively preventing Big Data from fully realizing its potential to deliver insight and provide value for scientists, business analysts, and society as a whole. To remedy this situation, we believe that novel technologies are necessary that draw on the concepts of declarative languages, query optimization, automatic parallelization, and hardware adaptation. In this talk, we will discuss several aspects of our research in this area, including results on optimizing iterative data flow programs, optimistic fault tolerance, and steps toward a deep language embedding of advanced data analysis programs. We will also discuss how our research activities have led to Apache Flink, an open-source big data analytics system that has become a major data processing engine in the Apache Big Data Stack, used in a variety of applications by academia and industry.
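
To give a flavour of the declarative-dataflow idea behind systems like Flink, the sketch below builds a plan of operators and lets a trivial "optimizer" rewrite it (fusing consecutive maps) before execution. This is a conceptual toy, not Flink's API or its actual optimizer.

```python
# Tiny illustration of declarative dataflow: the program builds a plan of
# operators; an optimizer rewrites the plan -- here, fusing consecutive
# maps -- before anything executes. Conceptual only, not Flink's API.
class Plan:
    def __init__(self, source):
        self.source, self.ops = source, []

    def map(self, fn):
        self.ops.append(("map", fn))
        return self                  # chaining keeps the program declarative

    def filter(self, pred):
        self.ops.append(("filter", pred))
        return self

    def optimize(self):
        fused, i = [], 0
        while i < len(self.ops):
            if (i + 1 < len(self.ops) and self.ops[i][0] == "map"
                    and self.ops[i + 1][0] == "map"):
                f, g = self.ops[i][1], self.ops[i + 1][1]
                fused.append(("map", lambda x, f=f, g=g: g(f(x))))
                i += 2               # two map operators become one
            else:
                fused.append(self.ops[i])
                i += 1
        self.ops = fused
        return self

    def execute(self):
        data = self.source
        for kind, fn in self.ops:
            data = map(fn, data) if kind == "map" else filter(fn, data)
        return list(data)

plan = Plan(range(10)).map(lambda x: x + 1).map(lambda x: x * x).filter(lambda x: x % 2 == 0)
print(plan.optimize().execute())     # [4, 16, 36, 64, 100]
```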

Presentation Slides

Presentation Slides Volker Markl (PDF, 3 MB)

Bio

Prof. Dr. Volker Markl is a Full Professor and Chair of the Database Systems and Information Management (DIMA) group at the Technische Universität Berlin (TU Berlin), director of the research group "Intelligent Analysis of Massive Data" at the German Research Center for Artificial Intelligence (DFKI), and speaker of the Berlin Big Data Center (BBDC). Earlier in his career, Volker Markl led a research group at FORWISS, the Bavarian Research Center for Knowledge-based Systems in Munich, Germany, and was a Research Staff Member and Project Leader at the IBM Almaden Research Center in San Jose, California, USA. Volker Markl has published numerous research papers on indexing, query optimization, lightweight information integration, and scalable data processing. He holds 19 patents, has transferred technology into several commercial products, and advises several companies and startups. He was speaker and principal investigator of the Stratosphere research project that resulted in the "Apache Flink" big data analytics system. Volker Markl currently serves as the secretary of the VLDB Endowment and was elected one of Germany's leading "digital minds" (Digitale Köpfe) by the German Informatics Society (GI). Volker Markl and his team earned an ACM SIGMOD Research Highlight Award in 2016 for their work on implicit parallelism through deep language embedding.

14.12.2017 - Blockchain as an Enabler for e-Government and e-Democracy

Speaker: Daniel Gasteiger and Giorgio Zinetti

Host: Prof. Dr. Burkhard Stiller

Abstract

This colloquium's talk on "Blockchain as an Enabler for e-Government and e-Democracy" will focus on the opportunities and challenges that this new technology brings to governments and supranational organizations when it comes to the digitalization and democratization of societies. Blockchain technology is being introduced in many countries for a variety of purposes, including the registration of movable and immovable assets such as land titles, commercial records, intellectual property, wills, social protection, health care data, and pension systems. Blockchain solutions are available to conduct auctions, to promote transparency of national and local budgets, to secure reliable vote counting in elections, and to create crowdfunding platforms that enable investors to trace expenditures on their projects. A special focus will be given to the opportunities that come with the digitalization of voting processes, given the technology's core features of immutability, anonymity, and persistence.
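
As a concrete illustration of the immutability property mentioned above, the toy sketch below chains records by hash, so altering any earlier entry invalidates everything after it. This is a minimal construction for intuition only, not any of the production voting or registry protocols the talk surveys.

```python
# Toy hash chain illustrating blockchain immutability: each record commits
# to the hash of its predecessor, so tampering with any entry breaks
# verification of everything after it. Not a production protocol.
import hashlib, json

def make_block(data, prev_hash):
    body = {"data": data, "prev": prev_hash}
    body["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    return body

def verify(chain):
    for i, block in enumerate(chain):
        expected = hashlib.sha256(json.dumps(
            {"data": block["data"], "prev": block["prev"]},
            sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected or (i > 0 and block["prev"] != chain[i - 1]["hash"]):
            return False
    return True

chain = [make_block("vote: yes", "0" * 64)]
chain.append(make_block("vote: no", chain[-1]["hash"]))
print(verify(chain))           # True
chain[0]["data"] = "vote: no"  # tamper with the first record...
print(verify(chain))           # False: the stored hash no longer matches
```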

Bio

Daniel Gasteiger has worked in financial services for more than 20 years. Starting out as an FX trader at Credit Suisse, he later joined UBS to work with hedge funds and third-party banks, promoting UBS's business-to-business API solutions and Prime Brokerage services. In his last role at UBS, he built up and managed the Office of the Chairman as a Managing Director. His fascination with blockchain technology led to the decision to start nexussquared, a Zurich-based business consultancy platform, in fall 2015. In September 2016 he founded Procivis AG, a Zurich-based start-up that looks to leverage blockchain technology and smartphones to deliver its "e-government as a service" platform.

Giorgio Zinetti is the CTO of Procivis and holds a Master of Science degree in Computer Engineering from the Politecnico di Milano. He previously worked at UBS, where he led a number of innovation projects within the UBS Wealth Management Innovation Group; before that, he worked as a software engineer. As Procivis CTO, he will design and implement Procivis's technology strategy in close collaboration with external technology partners, including the build-up of an in-house team of expert blockchain and crypto engineers. In addition to the successful implementation of the Procivis eID+ solution as part of the recently announced pilot project with the Canton of Schaffhausen, Giorgio Zinetti's immediate focus will be on the assessment of different blockchain protocols for digital identity and electronic voting services.
