The environmental effects of transporting goods are of increasing concern to managers and policy makers. The use of fossil fuels, such as petrol and diesel oil, in transport produces air pollutants that can have a toxic effect on people and the environment. However, one of the main drivers for the concern over the environmental effects of freight transport has been the potential effects of the production of greenhouse gas (GHG) emissions on climate change from the use of carbon-based fuels.
Research into models for transportation and logistics has been active for many years. Much of the modelling has aimed at optimising economic objectives or improving measures of customer service. However, in recent years, more research has been undertaken where environmental objectives are also considered, so that supply chain and other logistics services can be delivered in a more sustainable way.
For the OR analyst, there are many choices to be made about how to model freight transport operations. Can old models be revised with a simple change of objective, or are more radical changes needed? We shall examine some of these choices and illustrate the issues with case studies to show what contribution decision analytic models can make to environmental and other objectives. This will include issues raised by the use of new technologies, such as electric or other alternatively powered vehicles.
Richard Eglese is a Professor of Operational Research in the Department of Management Science at Lancaster University Management School.
He was President of the Operational Research Society in the UK in 2010-2011 and is currently a member of its General Council and Chair of its Publications Committee. He is also now President of EURO (The Association of European Operational Research Societies) until the end of 2018.
His research interests include combinatorial optimisation using mathematical programming and heuristic methods. He is concerned with applications to vehicle routing problems, particularly models for time-dependent problems and for problems in Green Logistics where environmental considerations are taken into account to provide more sustainable distribution plans.
In this talk I review a couple of Big Data applications that I personally like, and I try to explain my point of view on the subject as a Mathematical Optimizer, especially one concerned with discrete (integer) decisions. I advocate a tight integration of Machine Learning and Mathematical Optimization (among others) to deal with the challenges of decision-making in Data Science.
For such an integration I try to answer three questions:
Andrea Lodi received his PhD in System Engineering from the University of Bologna in 2000 and was a Herman Goldstine Fellow at the IBM Mathematical Sciences Department, NY, in 2005–2006. He was a full professor of Operations Research at DEI, University of Bologna, between 2007 and 2015. Since 2015 he has been Canada Excellence Research Chair in “Data Science for Real-time Decision Making” at the École Polytechnique de Montréal.
His main research interests are in Mixed-Integer Linear and Nonlinear Programming and Data Science and his work has received several recognitions including the IBM and Google faculty awards. He is author of more than 80 publications in the top journals of the field of Mathematical Optimization.
He serves as Editor for several prestigious journals in the area. He has been network coordinator and principal investigator of two large EU projects/networks and, since 2006, a consultant to the IBM CPLEX research and development team. Finally, Andrea Lodi is co-principal investigator (together with Yoshua Bengio) of the project "Data Serving Canadians: Deep Learning and Optimization for the Knowledge Revolution", recently generously funded by the Canadian Federal Government under the Apogée Programme.
Market design uses economic theory, mathematical optimization, experiments, and empirical analysis to design market rules and institutions. Fundamentally, it asks how scarce resources should be allocated and how the design of the rules and regulations of a market affects the functioning and outcomes of that market. Operations Research has long dealt with resource allocation problems, but typically from the point of view of a single decision maker. In contrast, Microeconomics has focused on strategic interactions of multiple decision makers. While early contributions to auction theory modelled single-object auctions, much recent theory in the design of multi-object auctions draws on linear and integer linear programming combined with game-theoretical solution concepts and principles from mechanism design. This has led to interesting developments in theory and in practical market designs.
The talk will first introduce a number of market design applications and show how discrete optimization is used for allocation and payment rules. These markets include industrial procurement, the sale of spectrum licenses, as well as cap-and-trade systems. In addition, we survey a number of theoretical developments and the role of integer and linear programming in recent models and market designs. Models of ascending multi-object auctions and approximation mechanisms will serve as examples. Finally, we will discuss limitations of existing models and research challenges. Some of these challenges arise from traditional assumptions about utility functions and social choice functions, which often do not hold in multi-object markets in the field. For example, financial constraints of bidders have long been ignored in theory, but they are almost always an issue in the field. Such deviations from standard assumptions lead to interesting theoretical questions, but also to very tangible problems in the design of markets.
Martin Bichler received his MSc degree from the Technical University of Vienna, and his Ph.D. as well as his Habilitation from the Vienna University of Economics and Business. He worked as a research fellow at UC Berkeley and as a research staff member at the IBM T. J. Watson Research Center, New York. Since 2003 he has been a full Professor at the Department of Informatics of the Technical University of Munich (TUM) and a faculty member at the TUM School of Management. He was a visiting scholar at the University of Cambridge (2008), HP Labs Palo Alto (2008), Yale University (2016), and Stanford University (2017). Martin is a faculty and board member of the Bavarian Elite Master program "Finance and Information Management" and a fellow of the Agora Group on Market Design at the University of New South Wales, Australia. He received an HP Labs eAward, the IBM Faculty Award, and the INFORMS ISS Design Science Award, and holds several patents. Since 2012 he has been Editor-in-Chief of Business and Information Systems Engineering and serves on the editorial boards of a number of journals including INFORMS ISR. Martin’s research interests include market design, mathematical optimization, game theory, and econometrics.
Artificial Intelligence and Machine Learning have become household names, hot topics avidly pushed by the media, with companies like Facebook, Google and Uber promising disruptive breakthroughs from speech recognition to self-driving cars and fully automatic predictive maintenance. However, in the forecasting world, reality looks very different. An industry survey of 200+ companies shows that despite substantial growth of available data, most companies still rely on human expertise or employ very basic statistical algorithms from the 1960s, with even market leaders slow to adopt advanced algorithms to enhance forecasting and demand planning decisions. This reveals a huge gap between scientific innovations and industry capabilities, with opportunities to gain unprecedented market intelligence being missed.
In this session, we will highlight examples of how industry thought leaders have successfully implemented artificial Neural Networks and advanced Machine Learning algorithms for forecasting, including FMCG manufacturer Beiersdorf, beer manufacturer Anheuser-Busch InBev, and container shipping line Hapag-Lloyd. I will leave you with a vision not of the future, but of what’s happening now, and how it can enhance supply chain and logistics planning.
Sven F. Crone is an Assistant Professor in Management Science at Lancaster University, UK, where his research on business forecasting and time series data mining has led to over 40 scientific publications and international awards for developing novel forecasting algorithms. As co-director of the Lancaster Research Centre for Forecasting, which with 15 members is one of the largest research units dedicated to business forecasting, he and his team regularly take state-of-the-art forecasting research and apply it in corporate practice. He has trained over 500 corporate demand planners and consulted with industry leaders on improving forecasting methods, systems and processes. Sven is also a regular keynote speaker at academic and practitioner conferences, sharing insights from hands-on consultancy projects and research in artificial intelligence and machine learning for FMCG/CPG, call centres and energy markets.
This presentation outlines a framework for improving the on-time performance of a public transport provider from a practitioners’ point of view.
In setting the goal for on-time performance, we differentiate between punctuality for the passenger journey, including transfers between trains, and punctuality for each service (train). Besides a differentiated goal, one needs a simple cost estimate per minute of delay, again separately for a passenger-minute and a train-minute.
The basic analytic work is to carry out various comparisons of scheduled versus actual values for travel times between stations, stopping times at stations, transfer times between services, maintenance durations and rotation plans for trains. For each analysis, we separate the erratic component (represented by the standard deviation) from the systematic “plan-error” component (the difference between the mean value and the planned value). This separation points to the two basic directions for improving on-time performance.
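The separation of systematic and erratic components described above can be sketched in a few lines. This is a minimal illustration only, not the provider's actual tooling; the function name and the example numbers are invented:

```python
import statistics

def decompose(scheduled, actual):
    """Split deviations from the schedule into a systematic and an erratic part.

    scheduled, actual: paired lists of durations in minutes for one
    activity (e.g. the travel time between two stations).
    """
    deviations = [a - s for s, a in zip(scheduled, actual)]
    systematic = statistics.mean(deviations)   # "plan-error": mean vs. planned value
    erratic = statistics.stdev(deviations)     # spread around the mean
    return systematic, erratic

# Hypothetical segment scheduled at 12 minutes, actually driven in 13-15 minutes:
sys_err, err = decompose([12, 12, 12, 12], [13, 14, 15, 14])
```

A large systematic component suggests adjusting the plan value, while a large erratic component calls for reducing operational variability.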
This implies departing from static scheduling as introduced in the 1950s and still employed by most major railways, and introducing a dynamic scheduling approach using OR methods.
The main obstacle to adjusting the schedule to accommodate the systematic errors is an overcrowded train system with very limited room for time-shifting train paths (schedules). We discuss options to resolve this impasse within the given framework through a comprehensive optimization approach (unfreezing the system).
Dr. Christoph Klingenberg studied mathematics and computer science at the universities of Hamburg and Bonn, Germany, and carried out postgraduate research at the University of Cologne, Germany, and at Princeton and Harvard in the USA. He then spent six years in top management consulting at McKinsey & Company. The major part of his professional life he spent with Lufthansa German Airlines in various strategic, planning and operational positions. In 2014 he joined Deutsche Bahn (German railways), where he currently heads the group's strategic division programs, including projects for autonomous driving, European train control systems and improved operational performance.
Dr. Klingenberg is married with 3 children and lives in Mainz, Germany.
Optimization, as a way to make "best sense of data" is a common topic and core area in operations research (OR), in theory and applications. Machine learning, being rather on the predictive than on the prescriptive side of analytics, is not so well known in the OR community. Yet, machine learning techniques are indispensable for example in big data applications. We start with sketching some basic concepts in supervised learning and mathematical optimization (in particular integer programming). In machine learning, many optimization problems arise, and there are some suggestions in the literature to address them with techniques from OR. More importantly, we are interested in the reverse direction: where (and how) can machine learning help in improving methods for solving optimization problems and what is it that we can actually learn? We conclude with an alternative view on this presentation's title, namely opportunities where predictive meets prescriptive analytics.
Marco Lübbecke is a full professor and chair of operations research at RWTH Aachen University, Germany. He received his Ph.D. in applied mathematics from TU Braunschweig in 2001 and held positions as assistant professor for combinatorial optimization and graph algorithms at TU Berlin and as visiting professor for discrete optimization at TU Darmstadt.
Marco's research and teaching interests are in computational integer programming and discrete optimization, covering the entire spectrum from fundamental research and methods development to industry scale applications. A particular focus of his work is on decomposition approaches to exactly solving large-scale real-world optimization problems. This touches on mathematics, computer science, business, and engineering alike and resonates with his appreciation for fascinating interdisciplinary challenges.
Viewed from a combinatorial point of view, the flow in network flow problems is typically assumed to be constant in time and to move without additional requirements such as pressure differences. This is no longer true if we look at energy networks such as water or gas networks. To appropriately model the physics of these flows, partial or at least ordinary differential equations are necessary, resulting, even in simplified settings, in non-linear, non-convex constraints. In this talk we look into the details of such models, motivate them by problems arising in the transmission of energy, and present first solution approaches with many hints to future challenges.
Alexander Martin studied Mathematics and Economics at the University of Augsburg. He completed his PhD and habilitation theses at the Technische Universität Berlin and was deputy head of the optimization group at the Zuse Institute in Berlin. From 2000 to 2010 he was professor for discrete optimization at the Technische Universität Darmstadt, where he served as vice president from 2008 to 2010. Since then he has headed the chair of "Economics, Discrete Optimization, Mathematics (EDOM)" at the University of Erlangen-Nuremberg.
He has been a member of two collaborative research centers, the graduate school of excellence Computational Engineering, and several networks supported by German ministries (BMBF and BMWi), and is currently the speaker of the collaborative research center "Mathematical Modeling, Simulation and Optimization using the Example of Gas Networks". Besides his editorial activities for several international journals, he was managing editor of the journal "Mathematical Methods of Operations Research".
He also received an honorary appointment to the BMBF advisory board "Mathematics". His research areas are the study and solution of general mixed-integer linear and nonlinear optimization problems, comprising the development of appropriate models, their analysis, and the design and implementation of fast algorithms for their solution. The applications come from the engineering sciences and industry, including network design, transportation problems and energy optimization.
IT-based processes have fostered the rise of shared mobility business models in recent years. In order to play a major role in people’s future transportation, reliable shared mobility services have to be ensured. The availability of a shared vehicle at the time and location of spontaneously arising customer demand is recognized as a requirement for replacing individual vehicle ownership in the long term. Methodological support for shared mobility systems can draw on operations research models originally developed in the field of logistics. We give an overview of optimization models with regard to network design, transportation, inventory, routing, pricing and maintenance that have been adapted to the operational support of shared mobility systems. For instance, the problem of relocating bikes over time in a station-based bike sharing system can be formulated as a service network design problem. We show that, next to covering routing, the problem formulation incorporates inventory and transportation decisions. The fact that the number of bikes is kept constant over time is captured by asset management constraints. A matheuristic is proposed to solve this problem to near optimality. Tailored techniques are to be developed in order to cope with these complex problems.
Dirk Christian Mattfeld is full professor of decision support in the business information systems engineering group at Technische Universität Braunschweig, Germany. His research focuses on the efficient use of resources in urban logistics and shared mobility. These interests comprise work in analytics, modelling and optimization. Dirk Christian Mattfeld has graduated from Universität Bremen and has been affiliated with Universität Hamburg and Technische Universität Braunschweig. With respect to GOR, he chaired the GOR working group on logistics and traffic from 2002 to 2010.
Blood is a unique product that cannot be manufactured, but must be donated, and is perishable, with red blood cells lasting 42 days and platelets 5 days. Blood is also life-saving. A multi-billion dollar industry has evolved out of the demand for and supply of blood with the global market for blood products projected to reach $41.9 billion by 2020. The United States constitutes the largest market for blood products in the world, with approximately 21 million blood components transfused every year in the nation and approximately 36,000 units of red blood cells needed every day.
Although blood services are organized differently in many countries, such supply chain network activities as collection, testing, processing, and distribution are common to all. In this talk, I will focus on the United States, but the methodological tools can be adapted to other countries. Specifically, in the US, the blood services industry has been faced with many challenges in the past decade, with a drop in demand for blood products and increased competition. Revenues of blood service organizations have fallen and the financial stress is resulting in loss of jobs in this healthcare sector, fewer funds for innovation, as well as an increasing number of mergers and acquisitions.
In this presentation, I will overview our research on blood supply chains, from both optimization and game theoretic perspectives. For the former, I will highlight generalized network models for managing the blood banking system, and for the design and redesign for sustainability. In addition, a framework for Mergers & Acquisitions (M&A) in the sector and associated synergy measures will be described. A case study under the status quo and in the case of a disaster of a pending M&A in the US will be presented. Finally, a novel game theory model will be highlighted, which captures competition among blood service organizations for donors.
Anna Nagurney is the John F. Smith Memorial Professor at the Isenberg School of Management at the University of Massachusetts Amherst and the Director of the Virtual Center for Supernetworks, which she founded in 2001. She holds ScB, AB, ScM and PhD degrees from Brown University in Providence, RI. She is the author or editor of 13 books, including the new book, "Competing on Supply Chain Quality: A Network Economics Perspective," with Dong Li, more than 180 refereed journal articles, and over 50 book chapters. She presently serves on the editorial boards of a dozen journals and two book series and is the editor of another book series.
Professor Nagurney has been a Fulbrighter twice (in Austria and Italy), was a Visiting Professor at the School of Business, Economics and Law at the University of Gothenburg in Sweden for the past 4 years, and was a Distinguished Guest Visiting Professor at the Royal Institute of Technology (KTH) in Stockholm. She was a Visiting Fellow at All Souls College at Oxford University during the 2016 Trinity Term. Anna has held visiting appointments at MIT (at the Center for Transportation and the Sloan School of Management) and at Brown University and was a Science Fellow at the Radcliffe Institute for Advanced Study at Harvard University in 2005-2006. She has been recognized for her research on networks with the Kempe prize from the University of Umea, the Faculty Award for Women from the US National Science Foundation, the University Medal from the University of Catania in Italy, and was elected a Fellow of the RSAI (Regional Science Association International) as well as INFORMS (Institute for Operations Research and the Management Sciences) among other awards. She has also been recognized with several awards for her mentorship of students and her female leadership with the WORMS Award, for example. Her research has garnered support from the AT&T Foundation, the Rockefeller Foundation through its Bellagio Center programs, the Institute for International Education, and the National Science Foundation. She has given plenary/keynote talks and tutorials on 5 continents. She is an active member of professional societies, including INFORMS, POMS, and RSAI.
Financial markets, banks, currency exchanges and other institutions can be modeled and analyzed as network structures. In these networks, nodes are agents such as companies, shareholders, currencies, or countries, and edges (which can be weighted, oriented, etc.) represent any type of relation between agents, for example ownership, friendship, collaboration, influence, dependence, and correlation.
We are going to discuss network and data science techniques to study the dynamics of financial markets and other problems in economics.
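A common construction in this line of work is a correlation network over asset return series: assets become nodes, and an edge is drawn whenever two return series are sufficiently correlated. A minimal sketch, with invented toy data and an invented threshold:

```python
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    norm = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return cov / norm

def correlation_network(returns, threshold):
    """Edges between assets whose return series correlate (in absolute
    value) at least as strongly as the threshold.

    returns: dict mapping asset name -> list of period returns.
    """
    names = sorted(returns)
    return [(a, b)
            for i, a in enumerate(names)
            for b in names[i + 1:]
            if abs(pearson(returns[a], returns[b])) >= threshold]

# Invented toy data: three assets observed over three periods.
edges = correlation_network({"X": [1, 2, 3], "Y": [2, 4, 6], "Z": [3, 1, 2]}, 0.9)
```

The resulting edge list defines the weighted graph on which network analysis techniques (centrality, clustering, dynamics over time) can then be applied.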
Panos M. Pardalos serves as distinguished professor of industrial and systems engineering at the University of Florida. Additionally, he is the Paul and Heidi Brown Preeminent Professor of industrial and systems engineering. He is an affiliated faculty member of the Computer and Information Science Department, the Hellenic Studies Center, and the biomedical engineering program, and the director of the Center for Applied Optimization. Pardalos is a world-leading expert in global and combinatorial optimization. His recent research interests include network design problems, optimization in telecommunications, e-commerce, data mining, biomedical applications, and massive computing.
Meta-algorithmics is a subject at the intersection of learning and optimization whose objective is the development of effective automatic tools that tune algorithm parameters and, at runtime, choose the approach best suited to the given input. In this talk I summarize the core lessons learned when devising such meta-algorithmic tools.
Meinolf Sellmann received his doctorate degree in 2002 from Paderborn University (Germany) and then went on to Cornell University as a Postdoctoral Associate. From 2004 to 2010 he held a position as Assistant Professor at Brown University, and he was subsequently Program Manager for Cognitive Computing at IBM Watson Research. Now he is Technical Operations Lead for Machine Learning and Knowledge Discovery at General Electric.
Meinolf has published over 70 articles in international conferences and journals, filed nine US patents, served as PC Chair of LION 2016 and CPAIOR 2013, Conference Chair of CP 2007, and Associate Editor of the Informs Journal on Computing. He received an NSF Early Career Award in 2007, IBM Outstanding Technical Innovation Awards in 2013 and 2014, and an IBM A-level Business Accomplishment Award in 2015. For six years in a row, Meinolf and his team won prizes at international SAT and MaxSAT solver competitions, among them two gold medals for the most CPU-time-efficient SAT solver for random and crafted SAT instances in 2011, the best multi-engine approach for industrial SAT instances in 2012, the overall most efficient parallel SAT solver in 2013 (at which point portfolios were permanently banned from the SAT competition), and 17 gold medals at the 2013 to 2016 MaxSAT Evaluations.
Last-mile logistics providers are facing a tough challenge in making their operations sustainable in the face of growing customer expectations to further decrease lead times to same-day or even same-hour deliveries, and/or to offer narrow delivery time windows. The providers respond to this challenge by investing in their analytic capabilities to make their last-mile logistics as efficient and intelligent as possible.
In addition, innovative and disruptive business models are currently on trial; for example, asset-lean start-ups use crowdsourced drivers or drivers on demand-dependent contracts. Several companies are experimenting with delivery drones or robots, and with ways of collaborating with each other ('sharing economy').
Many of these developments entail exciting new challenges for operations researchers. In this talk, I will review some of the most recent developments and reflect on future research directions.
Dr Arne K. Strauss is Associate Professor of Operational Research at the University of Warwick (UK). He holds a Ph.D. in Management Science from Lancaster University (UK), an M.Sc. in Mathematics from Virginia Tech (USA), and a diploma in Business Mathematics from the University of Trier (Germany).
His main research interests are price optimization, demand modelling and demand management. He has worked with various industrial partners, including Lufthansa Systems and the retailer Ocado. He has won several prizes for his research, including the doctoral prize of the British Operational Research Society for the best PhD dissertation in 2009. He is also a joint recipient of a 2015 IBM Faculty Award worth $20,000.
Currently, his work focusses on using demand management to improve last-mile logistics. He also leads a team at Warwick on a Horizon 2020-funded project on capacity ordering and price control in the context of air traffic management.
Nowadays, route planning systems are among the most frequently used information systems. Their algorithmic core is the classical shortest-path problem, which can be solved by Dijkstra's algorithm; for practical scenarios, however, Dijkstra's algorithm is too slow.
Algorithms for route planning in transportation networks have recently undergone a rapid development, leading to methods that are up to several million times faster than Dijkstra’s algorithm. For example, for continent-sized road networks, newly-developed algorithms can answer queries in a few hundred nanoseconds; others can incorporate current traffic information in under a second on a commodity server; and many new applications can now be dealt with efficiently. Accordingly, route planning has become a showpiece of Algorithm Engineering demonstrating the engineering cycle that consists of design, analysis, implementation and experimental evaluation of practicable algorithms.
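As a reference point for the speedups mentioned above, the baseline these methods are measured against is plain Dijkstra, which in a few lines looks like this (a generic textbook sketch with an invented toy graph, not one of the accelerated methods):

```python
import heapq

def dijkstra(graph, source):
    """Classical Dijkstra: shortest distances from source to all reachable nodes.

    graph: dict mapping node -> list of (neighbour, edge_weight) pairs,
    with non-negative weights.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry: u was already settled via a shorter path
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# Toy stand-in for a road network; edge weights are travel times.
toy = {"A": [("B", 2), ("C", 5)], "B": [("C", 1)], "C": []}
distances = dijkstra(toy, "A")
```

The accelerated techniques surveyed in the talk typically add a preprocessing phase on top of this query algorithm, trading preprocessing time and space for dramatically faster queries.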
Recently, new aspects have emerged, such as multimodal route planning, personalized journey planning with respect to multiple criteria, and energy-aware route planning for electric vehicles. This talk provides a condensed survey of recent advances in algorithms for route planning in transportation networks.
Dorothea Wagner is a full professor for Informatics at the Karlsruhe Institute of Technology (KIT). Her research interests include design and analysis of algorithms and algorithm engineering, graph algorithms, computational geometry and discrete optimization, particularly applied to transportation systems, energy systems, network analysis, data mining and visualization.
Among other activities, she is a member of the German Council of Science and Humanities (Wissenschaftsrat). From 2007 to 2014 she was vice president of the DFG (Deutsche Forschungsgemeinschaft - German Research Foundation), and from 2004 to 2013 speaker of the scientific advisory board of Dagstuhl - Leibniz Center for Informatics. In 2012 she received a Google Focused Research Award; she is a member of Academia Europaea and of acatech - National Academy of Science and Engineering, and a Fellow of the GI (Gesellschaft für Informatik).
Dorothea Wagner obtained her diploma and Ph.D. degrees from RWTH Aachen in 1983 and 1986, respectively, and her Habilitation degree from TU Berlin in 1992. From 1994 to 2003 she was a full professor at the University of Konstanz.
Data Analytics, Artificial Intelligence and Digitalization are general megatrends in research and industry. The big data trend was initiated by increasing computer power and the internet. But data alone are only information about the past; we have to find structures and causalities between the variables. Based on such models we can compute information about possible futures and move on to decision support or control. The aim of artificial intelligence is to solve the above problems with human-analogous methods. Neural networks and deep learning in particular play an important role in this effort. Siemens focuses on technical rather than internet applications, so we call our development machine intelligence instead of artificial intelligence. Finally, digitalization describes the path from physical processes (and production lines) to virtualization, using a digital copy of the processes for optimization and online control.
In the final part of the talk I will show, by means of examples, that research plays an important role in an industrial research center: First, we have to confront problems that would otherwise remain unsolved. Second, the accumulation of knowledge in long-standing teams creates unique selling points for the company.
Dr. Hans Georg Zimmermann studied Mathematics, Computer Science and Economics at the University of Bonn (PhD in game theory). He has been with Siemens Corporate Research in Munich since 1987 and was a founding member of the neural network research at Siemens (starting in 1987). Since 2000 he has been Senior Principal Research Scientist and scientific head of the neural network research, with applications in forecasting, diagnosis and control. He is a member of GOR, DMV and DPG, an advisor to the US National Science Foundation, and has given lectures and talks at universities on all continents.