To download the presentations, please refer to the German version of the program.
28 Feb 2018
09:25 — 09:30
Stage 1 | Opening
09:30 — 10:00
Stage 1 | EN
10:00 — 10:30
11:00 — 11:30
11:30 — 12:00
12:00 — 12:30
12:30 — 13:00
14:30 — 15:00
Stage 4 | EN
14:30 — 16:00
Stage 8 | Workshop
15:00 — 15:30
15:30 — 16:00
Stage 6 | EN
16:30 — 17:00
16:30 — 18:00
Retail & Logistics
17:00 — 17:30
17:30 — 18:00
Stage 7 | EN
tba | EN
1 Mar 2018
08:55 — 09:00
09:00 — 09:30
10:30 — 11:00
Stage 2 | EN
10:30 — 12:00
13:30 — 14:00
Stage 3 | EN
13:30 — 15:00
14:00 — 14:30
15:30 — 17:00
16:00 — 16:30
Opening by broadcast journalist Sarah Harman
The key challenge with predictive analytics is its disruptive nature: companies must handle technology and business model changes simultaneously. This risk is reduced by starting with those analytics services where the company has the most mature delivery capability, the deepest expertise and a solid data basis. Following this paradigm and using real-life machine data from typical industrial processes such as drilling and material handling, the presentation shows how typical machine learning algorithms can be used to set up reliable analytics services for typical business questions such as maintenance or tool-change time forecasting. It also gives guidance on which algorithms should be used for which cases. Statistical algorithms as an effective alternative to machine learning are covered as well. In this way, the presentation focuses on how to start with simple machine learning algorithms and scale up to more complex ones later, so that the introduction of analytics into the organization remains technically manageable. The presentation will also briefly treat the role of KUKA's Industry 4.0 accelerator team in managing such innovations, thus reducing the time to market.
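To illustrate the "start simple" idea, here is a minimal sketch of a statistical tool-change time forecast: a linear trend fitted by least squares, extrapolated to a wear limit. All numbers and names are invented for illustration; this is not KUKA's actual method.

```python
# Hypothetical sketch: forecasting tool-change time from wear measurements.
# A linear trend fitted by ordinary least squares often suffices before
# moving on to more complex machine learning models.

def fit_linear_trend(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

def predict_tool_change(hours, wear, wear_limit):
    """Extrapolate the wear trend to the operating hour at which the limit is hit."""
    a, b = fit_linear_trend(hours, wear)
    return (wear_limit - a) / b

hours = [0, 10, 20, 30, 40]                # operating hours (synthetic)
wear = [0.0, 0.11, 0.19, 0.31, 0.40]       # tool wear in mm (synthetic)
print(round(predict_tool_change(hours, wear, wear_limit=1.0), 1))  # → 99.8
```

Replacing the trend fit with a regression on richer machine features is the natural "scale up" step the abstract describes.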
The evolution of big data and digital transformation will change the health sector and confront statutory health insurers with new challenges. Medical care, the profile of medical practitioners and the role of patients will be heavily transformed by digitalization. In the future, statutory health insurers such as TK will consequently supply not only health benefits in kind, but also a centralized health platform. Today, TK already offers innovative health programs, ranging from digital migraine therapies to an electronic health record, thus meeting a growing demand from insured customers. Furthermore, TK has established digital and analytics units to organize the digital transformation and to use methods of artificial intelligence to provide additional benefits for its insured customers. In order to actively shape the digital transformation in the health care sector, statutory health insurers will have to actively invest in structures, know-how and culture.
The presentation "Swimming with the sharks – Standing up against tech giants with courage and artificial intelligence" will address myths of the digital transformation and illustrate the use of technology and data within Ringier with practical examples. Ringier is a multinational media company present in 16 countries across Europe, Asia and Africa and headquartered in Switzerland. Founded in 1833, Ringier today successfully runs print and online media, entertainment, e-commerce and classifieds businesses. But how is Ringier mastering the challenges of digitalization? The presentation outlines how technology and data are used as enablers to build an ecosystem strategy, and how implementing its own semantic engine and making effective use of artificial intelligence can help the company stand up against established international tech giants.
Digital transformation through Artificial Intelligence (AI) is no longer a question of "if" for companies, but just a matter of "when". This push is being driven by the big tech giants and startups, who are already leading the game. But the reality is that any company with the right data can start implementing AI technology for its business today. With the exponential growth of data, increasing customer expectations and tough trading conditions, sustainable and profitable growth is becoming essential. Artificial intelligence can evaluate many complex influences such as weather, promotions and customer experience to deliver the best prediction of profitable outcomes for a business at scale. Data-driven artificial intelligence is the foundation for optimized pricing, merchandising and supply chain processes. Every fine-grained transaction and every interaction can be crucial for success.
Hewlett Packard Enterprise's (HPE) "The Machine" research program – one of the biggest research projects in the IT pioneer's history – aims at overcoming the limitations of today's computer architecture. As a result of this research, HPE launched the first prototype of a radically new computer architecture. The core of this architecture is no longer the processor – it is a large pool of memory. Biomedical science and computer technology are increasingly coming together to solve critical health challenges. One such partnership, between HPE and the German Center for Neurodegenerative Diseases (DZNE), is applying cutting-edge technology to significantly accelerate research into neurodegenerative diseases such as Alzheimer's – a disease that affects more than 47 million people worldwide. By using advanced computing technology and new computational algorithms, HPE and DZNE have been able to significantly cut down the time required for genome analysis. This will not only benefit real-time analyses of human genomes for research on neurodegenerative diseases, but also improve personalised diagnostics overall.
Each data transformation journey is different, but there are some solution patterns that can help in traditional industries. The presentation will explore the following questions on how to drive the change towards a truly data-driven enterprise:
• How can you put data at the heart of your business to enhance your customers' experience of your brand?
• How can you set the right environment to launch your data projects in an agile way that allows you to fail fast and move on, for a quicker route to success?
• How can you create a modern data culture that revolutionises company operations, gets the most out of your data assets and drives your business forward?
• How can you keep the right balance between investments in new technologies (e.g. artificial intelligence and blockchain) and the basic groundwork that makes sure your company is ready to absorb them?
New services and products arising from digital disruption create new kinds of information and unpredictable data growth. Mobile devices leverage data that is physically stored in the cloud and geographically dispersed. But how do we operate and manage that data?
Fast-moving data is a major challenge for today's systems: insights are expected at the moment of the event. We need new strategies and platforms to cope with these requirements. Concepts like the data lake and platforms like Hadoop do not fit well with the standards of globally operating companies.
Today we can create and use data in the context of man and machine at any point on earth.
In this talk, DataStax discusses requirements and solutions for platforms that ensure data is managed securely, in a distributed and scalable way, even for real-time analytics. Successfully implemented projects based on DataStax Enterprise will be presented: customer experience at ING Bank and IoT at the logistics company Traxens.
Smart elevators – how KONE makes elevator and escalator service smarter with Watson IoT, optimizing maintenance and uptime. "Smart buildings" is a topic with a great future. The elevator and escalator specialist KONE uses Watson IoT to connect its elevators and escalators and monitor them remotely. Among other things, the company provides a 24/7 service for its intelligent maintenance based on the Watson IoT platform.
Christoph Großbaier, Head of Product Marketing at Celonis, explains how process mining works. Hear about the new big data technology that analyzes your company's data to uncover process weaknesses and optimization potential. Learn how companies such as Vodafone and Zalando are using process mining to raise the bar in purchasing, sales, supply chain and customer service.
The space and defence industry is facing unprecedented opportunities and challenges in a world that is becoming more complex and less predictable. Airbus is right at the center of this once-in-a-generation change, and digital and data are core to this multi-year transformation. In his talk, Jürgen Urbanski, VP Digital Platforms, will share some of the lessons learned from transformations such as the one Airbus is experiencing. He will focus on systemic issues that are prevalent in many large enterprises. The actionable insights for the audience will focus on how to make digital and data teams more self-sufficient in extracting business value from their data, how to accelerate the digital transformation across the enterprise, and how to best leverage the value of a software ecosystem, using examples from aerospace and defence.
From idea to final product: in this session, you will learn how we implement big data use cases in interdisciplinary teams of data scientists and big data engineers. You will get an insight into how we organize our teams and how we have built our big data platform. I will talk about selected use cases and cover the following areas:
• how we integrate data streams across data centers and companies to build real-time applications
• how we use Kafka as a stream data platform to connect on-premises and cloud data centers
• how we deploy data science models into production
• our experiences with big data products on the Google Cloud Platform
DB Station & Service AG operates approx. 5,400 train stations in Germany. Every day, 19 million passengers and visitors rely on our technical facilities for orientation, information, mobility, security and connectivity. In order to maintain our facilities efficiently and effectively, we are increasingly using IoT technologies. Internet of Things technology makes a significant contribution to speeding up our activities. We pave the way there by cooperating with industry, promoting start-ups and developing new systems in-house. In order to further improve the reliability of our technology for our customers, we rely on active asset management consisting of monitoring, data analytics, pattern recognition and continuous improvement. One quick win consisted of putting the professionals in charge of their data by integrating different platforms into a dashboard application. A challenge for the introduction of new technologies is our huge inventory of installed systems: without large investments, reaching a relevant share of our assets would take too much time. Here we are testing different strategies for introducing and implementing attractive use cases.
Deutsche Bahn invests heavily in energy efficiency. To that end, the most important drivers of energy consumption are identified and appropriate measures for increased efficiency are derived. For planning purposes, it is furthermore necessary to predict energy consumption as accurately as possible. Both retrospective analyses and prospective forecasts suffer from the potential bias caused by outliers. Such outliers may exist because of measurement errors at the sensor level or because of errors in the allocation of energy consumption to the respective train runs. Therefore Zero.One.Data, the big data startup of DB Systel, is using data science to identify and handle outliers, improving data quality, analyses and forecasts.
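A common way to detect such outliers without the detector itself being skewed by them is to use robust statistics. The following sketch (invented readings, not DB Systel's actual pipeline) flags values by a median/MAD score instead of mean and standard deviation:

```python
# Hypothetical sketch of robust outlier detection for energy readings,
# using the median and the MAD (median absolute deviation), which are
# themselves insensitive to the outliers they are meant to find.
import statistics

def flag_outliers(values, threshold=3.5):
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    # 0.6745 scales the MAD to be comparable to a standard deviation
    return [abs(0.6745 * (v - med) / mad) > threshold for v in values]

readings = [410, 395, 402, 4020, 398, 405, 390]  # kWh per train run; one allocation error
print([r for r, bad in zip(readings, flag_outliers(readings)) if bad])  # → [4020]
```

A mean/std z-score would be dragged towards the 4020 reading; the robust version flags it cleanly.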
Working with data is different, and it challenges companies in a variety of ways to discard old habits ("learn to unlearn") and learn new ones. Installing a data lab alone is not enough, and it takes more than just data scientists and deep learning to generate sustainable business value from data and AI. In our talk you will learn about the key ingredients for building up and developing enterprise-ready data and AI teams and spaces, but also about the toxic substances that destroy them. It is a thought experiment in which we want to re-think approaches, share experiences and next practices, and leave our everyday comfort zones. Looking into the future, we already see that it is not about skills, technologies, labs or CDOs. It is about people, spaces and structures, as well as their continuous reflection and conscious moderation in highly dynamic – we call them "fractal" – systems. Dohmann and Bollhoefer have already built several data labs, accompanied them into operation, and are currently developing some of them further in the direction of a factory, a hub or a platform.
The complexity of digital marketing is increasing rapidly. This is partly due to the growing number of communication channels and technologies (Amazon, digital assistants such as Alexa, etc.), and partly due to consumers themselves: buying decisions are made on various channels and devices throughout the customer journey (cross-channel and cross-device marketing). Data-driven marketing is the central success factor in implementing digital transformation and interacting with customers in the best possible way. Andri Fried (Head of Online Marketing, Marley Spoon GmbH) and Frank Rauchfuß (Managing Director and CEO, intelliAd Media GmbH) will demonstrate how businesses can significantly increase their turnover with big data customer insights and data-driven marketing. The experts will give decision makers and online marketing managers specific recommendations for action: merging the online and offline worlds, mobile first and growth strategies are just some of the current trends that will be addressed.
Sensors surge and big data grows. Two billion smartphones and many more GPS sensors are active today. What if we equip all cars, ships and aircraft with environmental sensors? How can we gain knowledge from all that data? How can we find the needle in this big digital haystack – in real time? Visual analytics, the science of analytical reasoning facilitated by interactive visual interfaces, can be a key enabler in tackling this challenge. With the right data architecture, analysis tools and visualization capabilities, any user can be turned into a powerful analyst who can make sense of today's massive amounts of geo-enabled and temporal data. To turn ordinary users into extraordinary analysts, it is necessary to understand the different dimensions of data in the context of the challenges of visual analytics. The power of accelerating insight through visual analytics will be demonstrated interactively by Lufthansa Systems: Lufthansa Systems synchronizes planning data with historical and real-time positional data and weather forecasts. This data combination is projected into space and time. Based on the analysis, Lufthansa Systems examines in-flight decision support for airline dispatch and operational analysis through visualization of the respective context.
Industry 4.0 relies on intelligent and digitally networked systems. Plants, people, machines, production goods, components and sensors – everyone and everything is located in a certain place at all times. Spatial and temporal reference therefore plays a central role in the digital planning and control of processes, and taking this spatial and temporal reference of data into account requires specific methods and technologies. The integration of geoinformation systems (GIS) enables the transformation of spatial reference systems, spatial binning, heat maps, interpolation and spatial joins of data. In addition, there are many other methods of space-time modelling, which are used for analysis, prediction and decision support, and for visualization in two- to four-dimensional interactive maps and space-time cubes. The talk provides an overview of the challenges and technical solution concepts for processing big geospatial data with GIS technologies, illustrated with developments at CLAAS that show the added value created for applications in industry.
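Of the GIS operations mentioned, spatial binning is perhaps the simplest to sketch: positions are snapped to a regular grid and counted per cell, which is the basis of a heat map. This toy example (invented coordinates, not CLAAS's implementation) shows the idea:

```python
# Illustrative sketch: spatial binning of machine positions into a regular
# grid. Counting points per cell is the first step towards a heat map.
from collections import Counter

def bin_positions(points, cell_size):
    """Map (x, y) positions to grid-cell indices and count points per cell."""
    return Counter((int(x // cell_size), int(y // cell_size)) for x, y in points)

positions = [(1.2, 3.4), (1.8, 3.1), (7.5, 0.2), (1.1, 3.9)]  # synthetic
heat = bin_positions(positions, cell_size=2.0)
print(heat[(0, 1)])  # → 3 (three positions fall into the same 2x2 cell)
```

Real GIS stacks perform the same aggregation in a projected coordinate reference system and at scale, but the cell-index arithmetic is the same.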
The talk gives an overview of the big data landscape at the railway. Starting with a short retrospective, it shows how the topic was approached from an organizational and technological point of view, with both financial and personnel challenges to deal with. A particular difficulty in the area of big data is the sheer range of possibilities, which makes getting started a challenge in itself. Launched as a small initiative, the effort had to scale relatively quickly due to high demand. The result is a data lake that collects data across the Group and distributes it efficiently for multiple uses.
Mobility will be increasingly supported by intelligent mobility applications. Access to data is critical for establishing integrated, efficient and sustainable transport systems. "Open data" offers great potential in that context. The German Federal Ministry of Transport and Digital Infrastructure (BMVI) is particularly committed to "open data" and adopted this principle early on. Besides the provision of data as such, it is also essential to turn data into practical use and new applications. There are many creative startups with excellent ideas, yet they frequently lack sufficient capital for the risky start-up phase, so financial support is needed to advance new ideas. With the mFUND, BMVI has set up an ambitious grant program for encouraging data-based innovations in the field of mobility. Up to 2020, EUR 150 million is available to support the development of innovative software applications based on available or newly collected data. In addition, BMVI is implementing a range of networking events that enable the participation of start-ups and creatives, especially from the sciences, in order to expand the reach of the mFUND and facilitate the exchange of knowledge and innovations within the program.
You are a member of a small team of consultants that wants to acquire a new client. Your team consists of several different specialists, such as data engineers, data scientists and business analysts, and in about one hour you are expected to present your findings and initial ideas to the board members of the potential client. But you have just received new data, and due to trouble with another project, the preparation for your meeting is only starting now. The meeting is approaching fast ... can you and your team get the required insights in the next 70 minutes?
In this workshop, you will work in mixed teams of 3-5 persons. To analyze the data of your potential client, the Watson Data Platform tools addressing the different roles will be provided. After a 10-minute introduction, it's your turn: you have 70 minutes to gather as many insights as possible. Finally, there will be a 10-minute feedback session, where we would like to hear about your experience in this true ad-hoc scenario and whether you felt prepared for the meeting. The workshop is targeted at all analytics user roles; coding skills are not required, as there are also tools that can be used without coding, in a visually oriented manner. Thanks to the mix within each team, everybody has the opportunity to see how, for example, a data scientist approaches the challenge and to get insights into that work. Example data will be provided to facilitate your analysis, and a team of expert mentors will be available for questions at all times. If you want to experience analytics, this is the right place.
Digitalization is in full swing. It affects our very personal environment and influences how we communicate, how we listen to music and, of course, how we shop. Customers in the 21st century no longer differentiate between individual channels: they shop from everywhere, switch between the online and offline worlds as a matter of course, and are increasingly accustomed to the convenience of e-commerce. But the brick-and-mortar store is catching up. The smartphone, as a constant companion in everyday life, is the crucial bridge between the two worlds. And in one respect the physical store is even ahead of e-commerce: customers still want to see and feel products. The purely "digital customer" does not exist. That is why "digital" at PAYBACK does not mean e-commerce: the future of digital retail is playing out in digital brick-and-mortar stores, which are more attractive than "just" online and more digital than the internet.
Artificial intelligence (AI) will change companies dramatically: robots are steered accurately through factories, software manages employees better than humans do, and digital assistants provide board members with strategic options. AI is considered the spearhead of digitization and will transform any company from within. At the same time, it is difficult to grasp: on the one hand, we are already discussing whether AI will take away our jobs; on the other hand, we still do not agree on a formal definition and test of AI. Maybe we have to talk about it differently. Computer science researcher Kris Hammond sees AI as the sum of elements that cover intelligent subfunctions. He defines 28 elements which can be combined as needed. In analogy to chemistry, he calls this a "Periodic Table of AI". It assists decision-makers in making informed statements about opportunities and risks without getting lost in technical details. The Periodic Table of AI helps to specify value-adding applications for digitization and makes it easier to compare vendor offerings.
This keynote will be held twice.
One of the main current challenges for utility companies is the increasing amount of data and the rapidly advancing degree of digitalization in all sectors of energy demand and supply. As part of the publicly funded research project ENERA, the EWE data science team discovers and evaluates possibilities and potential arising from this digital revolution. For that purpose, our team developed the agile and explorative approach "BrainWave", which is the focus of this presentation. As an example, we will illustrate how data-based potential can be tapped by showing the results of a study comparing the performance – both qualitatively and economically – of multiple machine learning techniques for forecasting the energy consumption of large consumers.
In this talk we will give an example of a machine manufacturer and its current challenges related to Industry 4.0. We explain how IoT, big data, predictive analytics and predictive maintenance can be a good opportunity to generate new business insights and develop new business models. Learn how you can use your data to improve maintenance processes and quality with better diagnosis and additional prediction. Opitz Consulting will illustrate the project context, the approach and the key success factors.
Data analytics lives from the availability of valuable data – as much of it as possible. Data sovereignty is the leading concept for making even sensitive data available. Within the Industrial Data Space Association, more than 100 member companies are developing a common reference architecture as a blueprint for data sovereignty. Deutsche Telekom is going to implement a data marketplace use case for analytics based on this blueprint. Finally, the Salzgitter Group is going to use this marketplace to share sensitive supply chain data for competitive advantage. An exciting story: join us for a data economy experience.
Naturally, operating charging stations generates a lot of data. Together with the Know-Center, has.to.be GmbH is working on possibilities and technologies for analysing and subsequently visualizing this data. The objectives of the joint research project include providing a consumption forecast based on historical consumption data, in order to allow operators of charging stations a cost-optimized energy supply. Furthermore, the data was analysed with regard to predictive maintenance, with the aim of optimizing the availability of the charging stations. In addition to the previously mentioned use cases, the project also attempted to derive availability predictions for the end user (the e-car owner). The talk shows how these tasks could be solved using (big) data technologies and which challenges were connected with them.
The analysis of large graph datasets such as Wikipedia, biological networks or social networks opens up the potential to generate information about customers and competitors, new treatments, or potential attacks. However, existing graph databases reach their performance limits with very large graphs and do not provide sufficient support for typical analysis workflows. The KNIME Analytics Platform in combination with Gradoop – a framework for distributed integration, analysis and storage of very large graphs – offers an innovative approach to the user-friendly and efficient analysis and visualization of big graph data. The actual execution takes place, transparently for the user, on scalable big data infrastructures, while feedback on calculation progress, intermediate results and analysis results is visualized directly in the tool. We demonstrate the approach with a use case analyzing the citation relationships of patents.
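The shape of the patent-citation use case can be shown on a toy graph. The talk uses KNIME and Gradoop on distributed infrastructure; this plain-Python sketch with made-up patent IDs only illustrates the kind of question asked:

```python
# Conceptual sketch of citation analysis on a toy patent graph.
from collections import defaultdict

citations = [  # (citing patent, cited patent) -- invented IDs
    ("P3", "P1"), ("P4", "P1"), ("P4", "P2"), ("P5", "P4"),
]

cited_by = defaultdict(list)
for src, dst in citations:
    cited_by[dst].append(src)

# Most-cited patents are candidates for "influential" technology.
ranking = sorted(cited_by, key=lambda p: len(cited_by[p]), reverse=True)
print(ranking[0], len(cited_by[ranking[0]]))  # → P1 2
```

On billions of edges the same aggregation would run as a distributed graph operation rather than an in-memory dictionary, which is exactly the gap Gradoop addresses.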
Volkswagen Commercial Vehicles is intensively involved in the digitisation of its products and services. In particular, the Connected Van and the connectivity services based on it focus on the aspects of user interface, vehicle data and geoinformation in real time. It is well known that this leads to new challenges for data processing strategies and methods. With the increasing volume of data, complexity and temporal validity of information, today's established technologies are confronted with requirements that push them to their limits, both technologically and in their approach. The "Smart Data Learning Group" was founded in 2016 in the environment of Volkswagen Commercial Vehicles in order to discuss and technologically investigate these problems, different solution approaches and the associated customer benefits. Since the technological solution approach is a focus topic, several innovative technologies, including the self-learning CortexPlatform, were examined as proofs of concept based on the open data source NYC Taxi Trips. The data source has been fully normalized (NF6), encompasses over 1.4 billion taxi journeys and provides various metadata for each taxi trip, such as date/time, geo-coordinates, distance travelled, journey time, number of passengers, and information on fares and type of payment, and will be evaluated LIVE with the audience.
The energy industry is undergoing a significant process of change. With the growth of distributed, decentralized energy resources, governments, suppliers and other stakeholders are experimenting with new solutions to manage the electricity grid more efficiently and enable the further development of renewable energy. After years in which the energy transition was primarily associated with the expansion of renewable energy, systemic issues are becoming more and more relevant. Generation, demand and price are more volatile and difficult to predict, and existing business models are fundamentally changing.
In car manufacturing, failure of critical machinery leads to downtime of entire production lines: one minute of unplanned standstill adds up to a loss of ca. €18,000. Until now, analytics activities had not succeeded in discovering early indications for predicting car body press failures. The reason is the high diversity of car models and the resulting high complexity of variants. IoT, edge and cloud technologies were used for data gathering and data availability. Unsupervised, self-learning algorithms analyzed that data, and multi-layer artificial intelligence methods uncovered highly complex data structures so that service technicians could be informed about future machinery failures. Damage is fixed before it actually occurs, unplanned downtime is avoided, and so are costs of ca. €18,000 per minute.
Porsche Austria GmbH is a subsidiary of Porsche Holding Salzburg. It imports and distributes cars to dealers and customers across Austria, and through this activity it is able to gather and utilize all kinds of data sources from the automotive area. Together with the Know-Center, Austria's leading competence centre in the area of data-driven business and big data analytics, Porsche Austria started to analyse customer and market data as well as historical car roll-outs to create a forecasting model for future market behaviour, independent of the brand or car model. For this, several data sources have been analysed: new, used and tactical vehicle registrations; buyer studies; sales data; additional market data. To evaluate the performance of the forecasting and prediction outcomes, multiple algorithms and methods have been applied. In the last stage, a prototype of two predictive models was implemented, focusing on the comparison of a non-linear model with purely statistical models. The analysis showed that the linear model (SARIMA) outperforms the non-linear model in many cases, although it does not utilize multiple data sources. Both models are combined and visualized in an interactive dashboard where multiple data variations as well as single brands can be examined in detail.
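Comparing a SARIMA model with a non-linear one presupposes a common yardstick on held-out data. The sketch below (invented numbers, not Porsche Austria's data) shows the evaluation step with a seasonal-naive baseline and MAPE, the kind of benchmark any candidate model must beat:

```python
# Hedged sketch of the model-comparison step: a seasonal-naive forecast
# scored by MAPE on a holdout period. Real SARIMA fitting would use a
# library such as statsmodels; here we only show the evaluation pattern.

def seasonal_naive(history, season_length, horizon):
    """Forecast each future step with the value one season earlier."""
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

registrations = [100, 120, 90, 110, 105, 125, 95, 115]  # two "seasons" of length 4
forecast = seasonal_naive(registrations[:4], season_length=4, horizon=4)
print(forecast, round(mape(registrations[4:], forecast), 1))  # → [100, 120, 90, 110] 4.6
```

If a complex non-linear model cannot beat such a baseline on MAPE, the simpler statistical model wins, which mirrors the SARIMA result reported in the abstract.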
Enders, Germany's biggest equipment wholesaler for the meat processing industry, is the one-stop source for everything butchery – and this one-stop approach also drives their big data strategy. Getting a grip on high volumes of data is just one side of the story, though. Enders also wanted to move beyond generating lists and establish a completely new type of data analysis. An SAP system – cloud-based SAP HANA – will supply the data and handle all operative tasks, while Qlik Sense will provide the data analytics capabilities. Being a retailer, Enders will naturally start with the sales department; the other business units will follow step by step. Limited resources pose the project's biggest challenge, so lean, practical solutions that are easy to use and deliver results fast are key. That is why Enders chose Qlik Sense.
Merck is a leading science and technology company in healthcare, life science and performance materials, with 50,000 employees in 66 countries around the globe. The advance of digitization is driving our transformation as a science and technology company more than anything else. To derive actionable insights from its diverse data sources on a large scale, Merck therefore embarked on its big data journey in 2015. Since then, over 100 business cases have been successfully implemented using the most relevant big data technologies (including Hadoop, Spark and Python). Our objective is to further develop and build a world-class advanced analytics capability. This requires advanced skills in cutting-edge technologies for data integration, curation, transformation, modelling and analytics. Leveraging big data technologies in a corporate context requires aggregating data from diverse sources in different formats in a unified platform, a so-called "data lake".
How can we quickly and effectively bring the available data into use in daily work and decision-making? Online tracking data (website visits, banner views, ...) are widely available, or can at least be provided. But how can we use this data quickly and effectively? We introduce a framework that allows us to analyze every single customer journey and to determine the impact of marketing campaigns on the individual touchpoints. In this talk, we host a live data meeting! On a realistic data set, we will tackle concrete questions from daily meetings live and derive real-time results from the data. The goal is to illustrate the power of fast data insights & analytics for your daily work. A close collaboration between the marketer and the data analyst is often key to success.
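Touchpoint-level campaign impact is usually estimated with an attribution rule. A minimal sketch comparing last-touch and linear attribution (the journeys and channel names are invented; real frameworks use far richer models):

```python
# Minimal attribution sketch over invented customer journeys.
# Each journey is the ordered list of marketing touchpoints before a conversion.

from collections import defaultdict

journeys = [
    ["banner", "search", "newsletter"],
    ["search", "search"],
    ["newsletter", "banner"],
]

def last_touch(journeys):
    """Credit the full conversion to the final touchpoint."""
    credit = defaultdict(float)
    for j in journeys:
        credit[j[-1]] += 1.0
    return dict(credit)

def linear(journeys):
    """Spread the conversion credit evenly across all touchpoints."""
    credit = defaultdict(float)
    for j in journeys:
        for touch in j:
            credit[touch] += 1.0 / len(j)
    return dict(credit)

print(last_touch(journeys))  # {'newsletter': 1.0, 'search': 1.0, 'banner': 1.0}
print(linear(journeys))      # 'search' earns the most credit under linear attribution
```

The gap between the two attributions is exactly the kind of question a live data meeting can settle on the spot.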
The field of anomaly detection has great relevance for the digitalized economy. Data streams are analyzed to reveal unexpected behavior. Depending on the application, it may be necessary to record millions of characteristics. BTC Unusual State Detection (USD) is a machine learning method that solves this problem in an innovative way: the normal (i.e., expected) state of a data stream is learned, so that deviations from this normal behavior can be reliably detected by the system automatically and in real time. The detection of anomalies can solve a variety of problems, such as the identification of attacks on computer networks. But even problems that are not associated with anomalies can often be solved with this technique. For instance, we reduce the question of whether load curves allow identifying business customers suitable for more efficient lighting contracts to the detection of anomalies.
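The core idea, learning a normal state and flagging deviations from it, can be illustrated with a simple z-score detector. This is a toy sketch with invented sensor readings, not BTC’s actual USD algorithm:

```python
# Toy sketch of learned-normal-state anomaly detection: learn mean and spread
# from a training window, then flag readings that deviate too strongly.
# The sensor values are invented; this is not BTC's USD implementation.

import statistics

def learn_normal_state(training):
    return statistics.mean(training), statistics.stdev(training)

def detect_anomalies(stream, mean, std, threshold=3.0):
    """Return indices of readings more than `threshold` deviations from normal."""
    return [i for i, x in enumerate(stream) if abs(x - mean) > threshold * std]

training = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4]
mean, std = learn_normal_state(training)

live_stream = [10.2, 9.9, 14.5, 10.1, 3.2]
print(detect_anomalies(live_stream, mean, std))  # [2, 4]
```

The lighting-contract use case follows the same pattern: a customer whose load curve deviates from the learned "normal" profile is flagged as a candidate.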
Smart data brings along great opportunities but equally great challenges for the enterprise: very often, the available data are unstructured, fuzzy and incomplete, and data-driven analytics and decision-making struggle to remain transparent and explainable. Using examples from the financial services industry and legal tech, the talk shows how smart data can be generated and interconnected to create new types of analyses:
- Intelligent analysis of claims and the matter of dispute in a legal expense insurance
- Risk assessment of projects, insurance cases and investments
- Compliance, RegTech and fraud detection in banking
- Data-driven customer and market analysis (know your customer)
Furthermore, the presentation offers an overview of relevant technologies, from text mining to semantic graph databases and transparent, verifiable self-learning approaches.
FlixBus is a European long-distance mobility provider and a brand of the FlixMobility group. Since 2013, FlixBuses have offered a new, convenient and green way to travel that suits every budget. Thanks to a smart business model and innovative technology, the former startup, now with more than 1,000 employees, has established Europe’s largest intercity bus network in less than four years, today providing 200,000 daily connections to 1,400 destinations in 26 countries. A main driver behind the growth and scalability of the business model is FlixBus’ technology platform, comprising various e-commerce solutions, mobile apps, custom-built business applications and increasingly data-driven services and AI solutions. By means of exciting mobility use cases, we’ll highlight essential technological and IT-organizational factors of this success story, especially the embedding of highly automated, loosely coupled data services into the business processes as well as data streaming and business events, in combination with automation and virtualization technologies, as the catalyst for a transformation towards a modern macro architecture with asynchronously coupled, self-contained systems.
We all know the situation: a bus is late and totally crowded, while the next bus arrives almost at the same time and is almost empty. What might be merely annoying for you is a big challenge that Berliner Verkehrsbetriebe (BVG) needs to solve. Data about routes is a promising approach, but dealing with huge amounts as well as various types of data is anything but trivial. Explore how BVG approaches a solution using agile methods and rapid prototyping with the aid of AR and cloud technologies.
With HDFS and HBase, two different storage options are available in the Hadoop ecosystem. Both have their strengths and weaknesses, but neither HDFS nor HBase can be used universally, which often leads to complex hybrid architectures. Kudu fills this gap and simplifies the architecture of big data systems. A large German bank uses a data platform based on Kudu and Cloudera’s Enterprise Hadoop Distribution to speed up credit processes. Kudu is used with great success to analyse huge amounts of data and, in addition, as a storage layer for several banking applications. This presentation depicts the business and technical requirements that had to be fulfilled, and the reasons why Kudu is a good choice for realising such requirements.
Strategy development for the digital transformation needs tools and the right mindset. Riegler & Co. KG is a specialist for compressed air technology and pneumatics that builds its central value creation on data enrichment and processing, among other things. To approach their strategy development in data management, they used DataCanvas. Philipp Krebs, Riegler & Co. KG, and Tobias Brockmann, innoscale AG, will present this tool and the learnings they gained by using DataCanvas across departments.
Big Data & Analytics are essential building blocks for successful digitization. But how can these building blocks be implemented within existing organizations? The Deloitte Analytics Institute and Union Investment will provide insight into their approach and talk about the daily business of their implementations. Fundamentals of data-driven business models are covered and typical challenges of grown organizations explained. We are going to present better practices and an approach for a sustainable strategy, followed by insights on how to implement initial use cases, generate value and gain management commitment. The outlook covers how to further scale Big Data & Analytics across the organization, and the connection with other important trends such as Robotic Process Automation.
We are presenting a revolutionary identity management system where users – and users only – decide what happens with their data. This user-centric identity management system is based on derived blockchain and cognitive database technologies. In addition to this intrinsic security architecture, an intuitive GUI provides user-friendly management of identities, authorisation, authentication and knowledge. The user can always see who accessed their data and when, which authorisation was used and who issued it. Our identity management system creates a secure environment for all identities at a company, be it for persons, machines, processes or objects. What happens with each and every identity can be tracked throughout the entire lifecycle in a transparent and easy manner. This is all possible thanks to a database technology that does not require any defined structures, is zero-redundant and creates a global index. The database is structured much like the human brain: it adapts to the data entered and links the data bidirectionally according to the synapse principle. All data exists only once, and even incomplete changes can be traced. The data is completely indexed and can be accessed in a matter of seconds without any time-consuming searches.
A modern vehicle is a ‘computer on four wheels’. The continuous collection of vehicle data facilitates the generation of innovative digital vehicle services. In analogy to the Quantified Self movement, the IT industry has already spawned a number of Quantified Vehicle startups backed by enormous amounts of venture capital. The talk analyses the most prominent Quantified Vehicle startups and then continues with the ongoing shift from traditional business models (e.g. the vehicle as a product) to new, data-driven business models (e.g. transportation as a service, digital services based on vehicle operation data). Hence, it will further extend the known concept of Industry 4.0 as “digitalization over the entire product lifecycle”.
A German premium car manufacturer uses several automatic identification, data collection and localization technologies from different suppliers in parallel. The corresponding software of the service providers leads to isolated applications without further integration and processing of the recorded data. As a solution, KINEXON RIoT was selected as a global localization platform for all production sites. By connecting various interfaces, KINEXON RIoT integrates localization data from its own sensor network as well as independent data from other sources (such as RFID or barcode) and converts these real-time events into relevant analyses. The focus is on location-based information, processing up to 500,000 data points per second with minimal latency (<50 ms). This high performance is the foundation for automating real-time processes in the industrial sector.
In today’s BI landscape it is more important than ever to be able to combine data from different sources, even large volumes of heterogeneous data. But business users need fast access to real-time data. This brings up a key question: do we in fact need all that raw data persisted in a centralized DWH (RDBMS)? What if we could introduce a transparent layer that analyzes data stored in other technologies, like Hadoop, S3, Redis, … via a virtual DWH? In this session a hybrid data warehouse architecture is built up from scratch using a fictive company named “FastChangeCo”.
For small and medium-sized enterprises (SMEs) it is much harder to make a new technology accessible than it is for larger companies. SDSC-BW was initiated by KIT and SICOS BW and is financially supported by the Baden-Württemberg Ministry for Science, Research and Art to support SMEs in the use of smart data analytics. The presentation describes the course of action and presents a number of success stories (among others Hermle, Herrenknecht, Rolf Benz). The success story of Echobot, a company that uses smart data analytics to provide information for the automation of business processes to its customers, will be presented by its Chief Data Scientist.
An important topic, but much manual labor: current approaches in automotive after-sales spare part pricing are mostly confined to manual price adjustments and are thus error-prone and resource-intensive. Hundreds of thousands of prices are not examined actively and are subject to flat price adjustments. The talk will answer the question of how machine learning can help to systematically handle the pricing of spare parts while conserving resources, with a focus on price elasticity and price-line optimization. Another topic of the talk is how the rule-based, yet partly subjective, price adjustments carried out by human price builders can be objectified based on existing data. Examples from practical experience will be provided in the form of user stories and lessons learned.
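Price elasticity, central to the talk, is commonly estimated as the slope of a log-log regression of demand on price. A minimal sketch with invented observations for a single spare part (not the speakers’ actual method or data):

```python
# Estimate price elasticity of demand via log-log least squares.
# The (price, units sold) observations are invented for illustration.

import math

observations = [(10.0, 200), (11.0, 165), (12.0, 139), (13.0, 118), (14.0, 102)]

# Ordinary least squares on (ln price, ln demand): the slope is the elasticity.
xs = [math.log(p) for p, q in observations]
ys = [math.log(q) for p, q in observations]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

print(round(slope, 2))  # -2.0: a 1 % price increase costs roughly 2 % of demand
```

Doing this per part over hundreds of thousands of parts is precisely where machine learning replaces flat, manual price adjustments.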
The global network of Lufthansa Technik Logistik Services GmbH (LTLS) moves thousands of aircraft spare parts every day. Volatility and shorter required cycle times make the management of operations a challenging and important task. In order to reduce uncertainty and allow for better planning, LTLS creates a digital twin of each material movement that makes all relevant information centrally available. Based on this newly available data, a pilot project of LTLS in collaboration with researchers from the chair of logistics of the University of Würzburg aimed at improving capacity management in the central goods receipt. The team uses machine learning techniques not only to predict demand but also to directly prescribe capacity for a specific day. This talk presents the challenges in creating a reliable database, the main ideas for selecting and implementing machine learning models, and the experience with agile project management.
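One standard way to prescribe capacity directly from a demand forecast (not necessarily LTLS’ actual approach) is the newsvendor framing: choose the demand quantile that balances the cost of idle capacity against the cost of a backlog. A toy sketch with invented costs and demand scenarios:

```python
# Newsvendor-style capacity prescription (toy sketch, invented numbers).
# Choose capacity at the critical quantile cu / (cu + co) of forecast demand,
# where cu = cost per unhandled item (backlog), co = cost per idle capacity unit.

def prescribe_capacity(demand_samples, underage_cost, overage_cost):
    q = underage_cost / (underage_cost + overage_cost)  # critical ratio
    ordered = sorted(demand_samples)
    index = min(int(q * len(ordered)), len(ordered) - 1)
    return ordered[index]

# 10 sampled demand scenarios for one day in goods receipt (fictitious).
samples = [80, 95, 100, 105, 110, 115, 120, 130, 145, 160]

# Backlog is 3x as costly as idle capacity -> plan at the 75th percentile.
print(prescribe_capacity(samples, underage_cost=3.0, overage_cost=1.0))  # 130
```

The point of "prescribing" rather than "predicting" is visible here: the output is not the most likely demand but the capacity decision that minimizes expected cost.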
Access to the latest market-moving information is key to investors’ success in financial markets when transforming information into value. In this context, we present a framework for a high-frequency real estate price index which is updated on a daily basis. Our online-based real estate price index offers investors and policy makers an undelayed view of price developments in the German real estate market and therefore provides a significant information advantage over existing real estate price indices. In the first part of our presentation, we will briefly motivate the advantages of our approach from an economic perspective, followed by a detailed elaboration of our data collection procedure and the data science approach used to compute the index. Finally, we will highlight a few recent real estate price trends based on our index, which now has a history of almost two years, and provide an outlook on the chances and challenges of online-based real estate price indices.
IT projects in the public sector, especially those handling millions of financial transactions a month, are mission-critical and characterized by highly complex functionality. Such projects have immensely high quality requirements; accordingly, the test cases are extensive, complex and expensive. Embedded in-memory technologies dramatically reduce test duration and speed up the delivery process. This is just one use case for embedded in-memory technologies in the public sector. As part of our presentation, we will introduce a number of other use cases and demonstrate the challenges and solutions against the background of real-life practical examples.
MaaS – mobility as a service – is one of the mobility concepts of the future. This includes, among other things, the modelling and evaluation of framework conditions such as the required number of autonomous vehicles, accepted waiting times from the time of request, and vehicle occupancy, as well as the simulation and control of autonomous vehicle fleets. Conventional production of the necessary origin-destination matrices and the traffic models based on them does not permit a rapid worldwide roll-out. To achieve this scalability, mass data must be used. One approach is the use of movement data based on GPS, sensor or GSM sources. The talk will give an overview of the current possibilities and limits of this data in traffic-related questions. Furthermore, it will show how the data must be processed and prepared to be usable in this environment.
AI and data technologies are revolutionizing not just how businesses are operated and managed, but how they are envisioned. AI is already transforming numerous industries and serving as a foundation for novel business models. However, realizing such a digital transformation in large organizations and well-established businesses is non-trivial and goes beyond just building the data infrastructure or AI algorithmic competence. With some illustrative examples, this talk will highlight the changing industrial businesses, focus on the broader business and strategic challenges of digital transformation, and provide some key lessons.
In many domains we see that the adoption of artificial intelligence (AI) is progressing at a rapid pace. Despite the strongly regulated environment, the transformative potential for life sciences is enormous. Infusing machine learning across the value chain of life sciences companies provides the opportunity to unlock this potential. This includes significant improvements to current operations as well as the development of new patient-centric services and business models for the benefit of humanity. Using real-life examples, the speakers will highlight key imperatives of applying AI in the context of life sciences.
Fundamentals of Machine Learning covers the most important concepts such as “supervised” and “unsupervised” learning. The methods range from clustering to regression to natural language processing.
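To illustrate the "unsupervised" end of that spectrum, here is a minimal k-means clustering sketch on invented one-dimensional data (a real course would use a library such as scikit-learn and higher-dimensional features):

```python
# Minimal 1-D k-means, an example of unsupervised learning (invented data).

def kmeans_1d(points, k=2, iterations=10):
    centers = [min(points), max(points)]  # crude initialization, valid for k=2
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:  # assignment step: each point joins its nearest center
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # update step: move each center to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
    return centers

# Two obvious groups around 2 and 10; the algorithm recovers their means.
data = [1.0, 1.5, 2.0, 2.5, 9.0, 9.5, 10.0, 11.0]
print([round(c, 2) for c in kmeans_1d(data)])  # [1.75, 9.88]
```

No labels are given anywhere, which is exactly what distinguishes this from supervised methods like regression.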
During this talk, the University Hospital of Schleswig-Holstein and IBM will present how AI is changing the healthcare ecosystem. What are the consequences when large internet providers own the largest data cohorts that will define the medical research of the future? Where will artificial intelligence have the biggest impact? And what are the chances and strategies for a German university hospital in a digital health economy? UKSH will present how a platform strategy, combined with the concept of a joint innovation hub, helps them accelerate their digital agenda, which focuses on digital medical service provision, robotic surgery and the creation of a keyboardless hospital.
“If the Wright brothers were alive today, Wilbur would have to fire Orville to reduce costs.” – Herb Kelleher, Southwest Airlines. The ongoing digitalisation of the aviation industry opens new opportunities for personalised services by using data, advanced analytics, and artificial intelligence. Lufthansa was presented with the “Most Innovative Airline” award by Future Travel Experience and with the IATA Travel Platinum Award for its most advanced digital service offerings. Marcel Kling will give insight into the program which puts Lufthansa into an industry-leading position, including use cases that are driven by data and AI. Marcel will explain why big data and advanced analytics are also at the heart of the Lufthansa Group strategy and business model to thrive in the market, create better service offerings, and bring more relevance into the customer communication. Andreas Ribbrock will explain in more detail how Lufthansa Group has created a data and analytics architecture to address challenges like a short time to market for new ideas, and to support an iterative and industry-grade approach to analytics and decision making. A key aspect of this approach is the combination of the latest technologies with existing solutions, plus adding new commercial products to the stack to ensure cost-effective operations (“AnalyticsOps”) of advanced analytics and AI data-driven products. Combining technology choices with the right people and processes is a key topic Andreas will address.
2018 will be the year AI becomes real for medicine. We’re going to move from algorithms to products and think more about integration and validation, so that these solutions can move from concepts to real, tangible solutions for our doctors. We will examine to what extent this prediction is going to come true, and will look at specific promises. We will provide an insight into which AI topics might be impactful beyond medical imaging, giving a glimpse into fields like critical care, general medicine, pathology and ophthalmology. Finally, we will help to identify where practical implementations in real-life healthcare settings would make sense today.
Compared to cognitive systems, artificial intelligence was the predominant topic of 2017. Technological developments in deep learning and the increasing use of optimized hardware continue to drive the fantasies. Hardly any startup does not claim to solve business problems better using artificial intelligence. We will reveal the details of artificial intelligence, give insights into current trends and take a look at the developments of the coming years. Even today, data science offers great potential for companies, and sales departments in particular can benefit from the targeted use of artificial intelligence. Based on concrete examples, we will show you how to better assess the potential of your customers, win new customers, automate your sales processes and introduce dynamic pricing into your company.
In the course of the next few years, enterprise systems will change significantly. By means of intelligent algorithms that process big amounts of data, software will learn new things and continue to optimize itself. In addition, the corporate surroundings will change, as will the way we work together. Join Markus Noga, Head of Machine Learning at SAP, and learn more about the intelligent enterprise, the future of business, and the symbiosis between the learning ability of machines and the human mind.
For online retailer OTTO, the Otto Group Business Intelligence division developed a product-evaluation feature that is unique in e-commerce to date: Aggregated Reviews. Customers can now select the most important aspects from product assessments on otto.de. At the very beginning of the development process, the BI team had to face numerous challenges, such as: How can the results be displayed to the user with added value? How to cope with text data that doesn’t follow grammatical principles? Or: How can dynamic aspects of reviews be extracted and well aggregated automatically? Open-source technology including theano (deep learning), scikit-learn (machine learning) and pandas (dataframes/data analysis) finally helped the team create a valuable product. Aggregated Reviews has been available on otto.de since May 2017.
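Aggregating review aspects boils down to extracting (aspect, sentiment) pairs and summarizing them per product. A deliberately simplified, lexicon-based sketch of that aggregation step (the reviews, aspect list and sentiment words are invented; OTTO’s production system relies on learned deep-learning models):

```python
# Simplified aspect aggregation over invented product reviews.
# The real OTTO system uses learned models; this lexicon sketch only
# illustrates the aggregation step.

from collections import defaultdict

ASPECTS = {"battery", "display", "price"}
POSITIVE = {"great", "good", "excellent"}
NEGATIVE = {"poor", "bad", "weak"}

reviews = [
    "great battery but poor display",
    "good price and good battery",
    "weak battery",
]

def aggregate(reviews):
    """Count positive/negative mentions per aspect across all reviews."""
    counts = defaultdict(lambda: [0, 0])  # aspect -> [positive, negative]
    for text in reviews:
        words = text.split()
        for i, w in enumerate(words):
            if w in ASPECTS:
                window = words[max(0, i - 2): i]  # sentiment words preceding the aspect
                if POSITIVE & set(window):
                    counts[w][0] += 1
                if NEGATIVE & set(window):
                    counts[w][1] += 1
    return dict(counts)

print(aggregate(reviews))  # {'battery': [2, 1], 'display': [0, 1], 'price': [1, 0]}
```

Ungrammatical review text is exactly why the real system replaces such fixed windows with learned extraction models.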
Sourcing cloud solutions by supply chain experts for supply chain experts: scoutbee’s sourcingOS is the operating system for buyers. Its three core cloud-based applications provide the ideal sourcing environment for procurers searching for the best suppliers worldwide. Each functionality has been developed in collaboration with buyers and supply chain experts from across the world and tailored to meet their needs. ARTIMIS, scoutbee’s artificial intelligence, helps buyers, production planners and supply chain managers identify new innovative suppliers, achieve sourcing savings and secure the value chain. First we will take a look at the supply chain of 2020 and explain ARTIMIS’s artificial intelligence process through a live demo. A case study will be used to demonstrate the practicality and success of the core technology in a live environment, highlighting challenges and benefits for manufacturing companies. Alongside the implementation and live demonstration with the user, all lessons learned and an in-depth look at the future of AI within the supply chain will be discussed.
Inspirient develops and markets an Artificial Intelligence (AI) to fully automate the analysis of business data. At Consorsbank, the German subsidiary of BNP Paribas, a joint proof-of-concept project was set up to evaluate the practical applicability of Inspirient’s AI to financial service companies. Application scenarios included the automatic and adaptive detection of both online fraud (so-called “phishing”) and money laundering. The primary objective of this project was to understand the AI’s ability to learn, its overall flexibility, and its generality in everyday banking use. In this talk, we briefly introduce Inspirient’s AI and discuss to what extent the goals of the proof-of-concept project were achieved. We cover the necessary technical adjustments, the results we achieved, and the integration into corporate processes at Consorsbank. In particular, we focus on the challenges and opportunities of using Cognitive Analytics and on first-hand experiences from our cooperation.
Becoming the world market leader in the field of artificial intelligence by 2030 – this is the declared intention of the Chinese government. A package of measures affecting the economy, politics and society has been introduced with the aim of stimulating research and the artificial intelligence industry in the People’s Republic. China’s intention is taken seriously by Western experts. The talk will give an outline of the current AI situation in China, with concrete use cases further illustrating the current status. Additionally, the following points will be discussed: How can German companies react? Where do the chances lie, and what needs to be considered when doing business with Chinese partners?
This presentation offers a glimpse into the machine learning systems used by Daimler to support the diagnosis of problems in cars. The presented system is routinely trained on millions of past repairs and is in practical use in most Mercedes-Benz repair shops worldwide, serving tens of thousands of predictions every day. A specific focus of this presentation will be the lessons learned on the way from successful PoC to deployed system.
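A drastically simplified sketch of the idea behind such a diagnosis support system: score candidate repairs by how often they co-occurred with the observed trouble codes in past cases. The codes, repairs and the scoring rule here are invented for illustration; Daimler’s production system is far more sophisticated:

```python
# Toy diagnosis recommender: rank repairs by co-occurrence with observed
# trouble codes in historical cases. All cases below are invented.

from collections import Counter

history = [
    ({"P0301", "P0420"}, "replace ignition coil"),
    ({"P0301"}, "replace ignition coil"),
    ({"P0420"}, "replace catalytic converter"),
    ({"P0171", "P0420"}, "clean air mass sensor"),
]

def recommend(observed_codes, history, top=1):
    """Return the `top` repairs whose past cases overlap most with the codes."""
    scores = Counter()
    for codes, repair in history:
        overlap = len(observed_codes & codes)
        if overlap:
            scores[repair] += overlap
    return [repair for repair, _ in scores.most_common(top)]

print(recommend({"P0301", "P0420"}, history))  # ['replace ignition coil']
```

Scaling this from four toy cases to millions of past repairs, and serving tens of thousands of predictions a day, is where the real engineering lessons of the talk lie.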
Every year, falls by the elderly cost more than €2bn in treatment in Germany alone – that’s 7 per cent of all insurance spending for that age group. Imagine if you could know the individual likelihood of why, when and where a person might fall. How many people and how much money could you save? We designed an integrative model combining proven psychological tests with an AI-powered analysis of gait to calculate the individual likelihood of a fall and then provide tailored recommendations. Is elder care ready for data science?
Chatbots and virtual agents are hot topics of the hype surrounding the introduction of artificial intelligence technologies. Especially in large corporations, a true race for the fastest – but not always the best – implementation of chatbots for customers but also for intra-corporate users has flared up. If this process is not coordinated, the result can be a veritable zoo of the most varied applications, which makes the already complex IT infrastructure even more complex and expensive. In addition, there is a risk that users will be confronted with chatbots of highly varying quality, so that benefits and a positive user experience are not guaranteed. In various projects on the subject of an AI blueprint, Accenture together with various customers defined minimum requirements and standards for software and implementation and, for example, successfully implemented such standards with a reference bot.
In January 2019 the new IFRS 16/US GAAP ASC 842 lease accounting regulations will come into effect, bringing into review $2.8 trillion of assets on and off company balance sheets. There is one common thread throughout this reporting change: to successfully manage the lease accounting transition, companies need to gather a significant amount of data from their globally spread leases and evaluate it. LEVERTON’s AI information extraction technology is ideally equipped to support companies with these challenges. Our client Deutsche Post DHL, facing the new IFRS 16 lease accounting regulations, will present how they leverage AI technology to simplify their transition. DPDHL’s database holds thousands of leases in numerous countries and several languages. The usually time-consuming and resource-intensive task of sorting those lease documents and afterwards translating and aggregating the relevant information is now being solved by the AI-based technology of LEVERTON. Learn how the software gathers all leases in one platform, automatically extracts the IFRS 16-relevant information from the lease files and imports it into the existing ERP system.
The discussion about the use of artificial intelligence (AI) concerns the way machines, systems or processes can be optimized to improve different areas of our life and, ultimately, the quality of life. In the health sector, for instance, this is reflected in providing support for diagnoses; the services sector experiments with bots to answer customers’ questions, while in banking there is the hope that in the future smart algorithms will satisfy customers’ financial demands faster and more objectively. In the future, it will also be possible for human beings to empower their bodies by relying even further on AI. Already today we use technologies such as genetics, biotechnology and nanotechnology to make everyday life easier for people suffering from sickness or disabilities. For example, we use prostheses, heart valves and small pumps implanted in the pancreas to provide relief for diabetics, or we install smart contact lenses which permanently measure the blood sugar level. The societal implications of using such technologies to make healthy people more powerful seem to have been ignored in the public debate. The question arises whether regulatory limits should be set or whether the market should eventually lead to a solution.
The competition for the leading market position and for setting new standards in the field of Artificial Intelligence (AI) is in full swing worldwide. An essential decision factor and key criterion for the competition is the access to data. In this context, the quantity and, in particular, the quality of the data is of special significance. The concept of AI raises concerns in regard to legal ethics and liability and presents the data economy with new challenges concerning the law of competition. It also raises the preliminary question of the (legal) assignment and exploitation of data as well as the question whether and to what extent the access to data and the use of data-processing-algorithms must be subject to legal control. The dominant position of intermediaries, service providers and data collectors on the digitalized market and the importance of questions regarding the access to data and the concept of interoperability therefore demand the examination of the (data-related) competition law.
Currently, performance assessments use only a small amount of recorded flight data, predominantly in the form of aggregated indicators for situational evaluations. TU Darmstadt, in partnership with Lufthansa Technik, investigates the development of artificial intelligence tools to optimize aircraft performance analyses. Data-based frameworks thereby reproduce characteristics in detail, depending on routes, aircraft and environment, for example to model fuel consumption. Machine learning methods allow significant improvements in analysis and prognosis accuracy compared to conventional procedures. As a result, the approach provides recommendations for the eco-strengthening of aircraft and contributes to streamlined fuel planning management. The talk comprises model insights, results, best practices, and lessons learned from application-oriented research on the implementation of an AI framework in aviation.
Big data and AI are massively trending. The AI methods that have created this hype, the so-called deep neural networks, need billions to trillions of data points to learn, e.g. in image recognition and automated translation. Such datasets are only available to data giants like Google, Facebook and Amazon. The majority of German companies, even big enterprises, do not have such amounts of data. To serve them with AI solutions, we have developed methods over the last four years that allow us to train AI applications with small data sets. These procedures represent human knowledge in higher mathematical logic, transform text into mathematics in a sense-preserving fashion, and can then either be trained or work without training by comparing a given case with reference knowledge. This enables the use of AI in almost every company. The applications are already in production with our customers. The talk describes a use case at SV Informatik: the automated validation of car glass repair bills.
• How does a classic machine building company such as the SMS Group become a digital enterprise?
• What are currently the greatest potentials for the use of AI?
• What challenges do we face in the operational use of artificial intelligence?
Artificial intelligence is becoming reality in many companies. Insurance companies automate their claims management, manufacturing companies forecast machine failures, and service companies use AI to automate and customize customer interaction. In the future, these processes will become extremely important to business operations, and the applications will have to work 24/7 without downtime. This makes AI applications an integral part of business-critical processes. Benefit from practical tips and tricks for the strategy and implementation of AI, and receive detailed insights from successful projects. As an example, SMS Digital is introducing its new AI product preQ. With the help of artificial intelligence, the new application is able to predict the failure of an extrusion press one hour ahead. This allows maintenance to react proactively and minimize downtime costs.
Artificial intelligence and machine learning transform Enterprise Search applications into a central, powerful knowledge base. According to Gartner, Enterprise Search is evolving into the Insight Engine. AI-based applications such as an Insight Engine help to analyze and interpret large volumes of data, leveraging a wide range of deep learning, rule-based, linguistic, and semantic practices and models. Machine learning methods such as text classification help with the thematic classification of documents. Using the example of JURION, a digital law library and specialist information platform from Wolters Kluwer, the lecture explains how content can be made more usable through intelligent content analysis.
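The talk does not detail JURION's implementation; as a minimal sketch of the text-classification step it mentions, here is a common TF-IDF baseline in scikit-learn. The documents and topic labels below are invented for illustration, not Wolters Kluwer data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus with hypothetical topic labels (not JURION content)
docs = [
    "the tenant must pay the rent by the third working day",
    "notice of termination of the lease requires written form",
    "the employee is entitled to paid annual leave",
    "overtime must be compensated according to the employment contract",
]
labels = ["tenancy", "tenancy", "labour", "labour"]

# TF-IDF features plus a linear classifier: a standard baseline
# for the thematic classification of documents
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(docs, labels)

print(clf.predict(["the landlord terminated the lease"]))
```

In a production Insight Engine, this supervised step would sit alongside the rule-based, linguistic and semantic models the abstract lists.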
Data analysis and machine learning are the key technologies when it comes to optimizing processes and products in the digital environment. In the automotive industry in particular, digitalization has transformed the vehicle from a purely mechanical product into a highly complex software-driven system. This increased digital complexity poses new challenges to quality assurance in automobile production. Based on vehicle and diagnostic data from vehicle manufacturing, methods for the early detection of anomalies and weak points can now be developed to make production processes more stable and efficient. This presentation shows the implementation of such a data mining project in automobile production: a classification model for the early detection of problems is created, and a root-cause analysis is carried out.
In an increasing number of cases, medical experts discover the roots of complex diseases, such as cancer, within the human genome. Analyses of the individual genetic code of each patient are therefore the foundation of innovative precision medicine. For example, the genetic profile of a tumor sample and the individual lifestyle of a patient can provide insights into the efficacy of available chemotherapies. However, acquiring the genetic profile is very time-consuming, e.g. due to the high number of required process steps, the sheer amount of data, and the use of individual data formats. The “Analyze Genomes” cloud platform incorporates the latest in-memory technology to open new perspectives for precision medicine and digital health within clinical routine. For the first time, it enables the instantaneous analysis of big medical data and its combination with global medical knowledge using the latest machine learning and AI algorithms. As a result, medical experts can discover and assess available therapy options much faster and initiate the best choice from the very beginning.
The presentation will show how customers and exhibitors can be connected according to their preferences using modern AI services. The benefits of such a digital marketplace 4.0 are explained using the intelligent business-matching platform of Messe Frankfurt GmbH. “The right people, the right encounters, the right place” – this is the motto of Messe Frankfurt. Modern AI services make it possible to create exactly the right encounters across all digital channels by automatically acquiring customer interest profiles and automatically linking them to the offers of the manufacturers. Different sources must be integrated, and digital profiles must be created and matched against the offers in near real-time. The result is an intelligent exhibition space realizing a new definition of the customer and exhibitor experience.
Artificial intelligence and the technologies associated with it will no doubt raise the relationship between humans and machines to a new level, to a true partnership. In this context, the companion motif can be seen as the very incarnation of the user experience. Within the concept of companion technology, there are already very extensive definitions regarding the qualities a system must demonstrate in order to become a true companion to the human beings who use it. The interesting question now, however, is how the system then behaves. The benefits of a digital companion in the sense of a reminder or recommender system may be great in individual cases. But when it comes to the future of work – which will become increasingly knowledge-based – such functions fall short of ideal. From this perspective, how can a system not only mitigate any human shortcomings but also support and even initiate creative problem-solving?
Roman Lipski, a renowned painter from Berlin, gets inspired by his Artificial Muse, the first of its kind in the world of art. It is a new kind of symbiosis between man and machine, a dialogue that evolved over the last two years creating not just a new kind of art but step by step a new kind of artist. It is not Creative AI (as some call it), it is Inspirational AI (as we call it) and time to re-create and re-think the connections of human and artificial intelligence for the future. Not just in the arts, but everywhere. What if we could build an Artificial Muse for everyone? For our enterprises? What if we could redefine how we collaborate, communicate and co-create with intelligent services and devices? What if we could learn from and inspire each other? How about harmony instead of disruption and destruction? We took the first steps to find answers, started building an Artificial Enterprise Muse and now want to show and share our first findings.
Since 2017, Taylor Wessing has been developing its own "Legal Tech Framework" with the support of Empolis, based on available AI technologies. This framework is designed to create training data and model information for Taylor Wessing's various legal areas and processes. Classifiers and extractors for the recognition of document types, topics, legal clauses and specific content are created iteratively to form the foundation on which solutions for specific use cases are then implemented: from the recognition and highlighting of clauses up to automated contract evaluation. In contrast to common market solutions, the data – which is the real value – remains completely in the hands of the company, which can also decide freely on the integration of desired functionalities. In the lecture, the speakers will give an insight into the possible applications of the framework and explain in particular why they have decided to develop their own product despite the growing range of legal tech solutions on the market.
Sensor-based monitoring of health-related patient data, enriched with context information gathered for example via an app, forms the basis of current research projects on obesity, epilepsy and Parkinson's disease at Fraunhofer ISST. The goal is to capture a particular pathology holistically, to identify digital biomarkers, critical indicators or therapy-process patterns, and to derive the required measures for the individual patient affected. These research approaches focus on how to gather and use context information for decision support, and how to provide the patient with information and recommendations for action, so that the app, as a permanent companion, supports holistic prevention or therapy. At the same time, legal and regulatory requirements on data-based influence on the patient must be taken into consideration.
"Digital Excellence Ethics" takes on the trend of digitization in Big Data Analytics and AI/ML and keeps pace with its speed. Using dynamic categories, this ethics continually develops new options for action and new concepts of communication in order to bring societal ideas and ethical judgments into the design processes themselves, as "Ethics on the Train". "Ethics on the Engine" goes one step further, pushing our values to where digital communication is developing. If we want to ride with ethics on top of the train – on the engine – we have to become an intrinsic part of the design process itself (Schnebel / Szabo / Rusch 2017). Sitting on the engine means using ethics as a language: proposing categories, introducing differentiations and rejecting them again; developing instruments to make ethical topics communicable and to integrate behavioural economics (Schnebel 2017).
For the assessment of security situations, a variety of relevant, highly up-to-date sources is available worldwide today. To exploit them fully, automated processing is required. Archiving and categorization can be automated using Big Data technologies. In addition to keyword-based methods, we have used algorithms to detect latent topic relations. Using the Latent Dirichlet Allocation (LDA) algorithm, relationships between documents are uncovered. The method provides a meaningful and cost-effective classification of a large document corpus. For individual documents, it can quantitatively indicate the relevance for specific topics (here: threat categories), which considerably improves the results of the search and its sorted output. Over time, trends and changes in individual types of threats can be deduced from shifts in certain word frequencies or the emergence of new terms.
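The LDA step described above can be sketched in a few lines with scikit-learn. The toy corpus and the two-topic setting are illustrative; the actual system, its sources and its threat categories are not public:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus; the real input would be the worldwide security-relevant sources
docs = [
    "explosive device found near the station",
    "bomb threat closed the central station",
    "flood warning issued for the river basin",
    "heavy rain caused flooding in the region",
]

# LDA operates on raw term counts, not TF-IDF weights
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)

# doc_topic[i, k] quantifies how relevant topic k is for document i,
# which is exactly the per-document relevance score the abstract mentions
doc_topic = lda.fit_transform(counts)
print(doc_topic.round(2))
```

Tracking these topic distributions over time is what allows the shifts in word frequencies and emerging terms to surface as trend signals.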
The way algorithms are programmed has not changed since the 1980s. Describing and maintaining complex algorithms requires a lot of effort and long development cycles, and it is extremely challenging to run them on small or old hardware systems. In addition, calculation speed and accuracy are difficult to achieve. The small paceval. library offers a revolutionary solution to this: most complex algorithms can be implemented and maintained quickly and easily, and paceval. runs on any development environment and hardware system. We explain how paceval. works and demonstrate how easily and quickly even highly complex algorithms can be implemented on the smallest processors. ‘Mathematics is the language in which God has described the universe’ – Galileo Galilei (1564–1642)
For small and medium-sized enterprises, the efficient handling of inquiries is decisive in order to stay competitive even with individual customer requests. For this purpose, an intelligent assistance system to support employees was developed in cooperation with the company intrObest GmbH & Co. KG. Based on customer requests, components are chosen from an internal database and enriched with pricing and delivery information, making it possible to create offers promptly. The system takes over time-consuming tasks and supports the employees' decisions by making suggestions. Its flexible structure makes it easily adaptable to new company scenarios, and even without a high expenditure of resources it is possible to use technologies like machine learning that become accessible through digitalization.
It's time for AI to lose its glamour, its magic – at least the kind of AI we are mainly talking about: Machine Learning, Deep Learning & Co. "Software is eating the world, but AI is going to eat software", a quote from Jensen Huang, CEO of NVIDIA, needs to be buried as a credo and marketing bubble. Let's get to work. It's a better idea to talk about Software 2.0 – as some people in the Valley are doing at the moment. But what is really happening? What needs to be understood? What do we need to learn (and to unlearn)? What do we need to change and to incorporate in our organizations? We want to share many ideas, give a decent overview, share core statements and take a look into the future of software development. Together with you. In 2018, AI will go mainstream. In 2018, we will talk about production-ready development and 24/7 operation of AI systems and services. So: software development and application management, with or without a version number. With algorithms under the hood, at the interfaces, inside networks, driven by data and the phenomena represented in them. In 2018, we will not only need sexy data engineers, but cross-functional, agile teams, as well as redefined structures and development processes, tools, platforms and frameworks, which will enable us to seamlessly link the different paradigms, skills and capabilities – technically, organizationally, culturally.
From ancient Egyptian stone carvings to electronic messages on Facebook timelines – text has always been a main driver of human culture. What is different now? It has never been easier to create text than today. This is why the amount of text is constantly growing – also due to user-generated content (UGC): in company wikis, digitalized customer service calls, user groups, forum posts, tweets, Facebook posts, product reviews, and more. Users write what they think. While this can sometimes feel like losing control over your own reputation, it can be managed and even turned into a valuable source of innovation, ideas and service improvements. And this information comes directly from customers! This talk sums up the results of a study on how unsupervised and supervised machine learning can be used to gain insights from user-generated content: we find the most exciting, previously hidden topics, cluster user profiles by engagement and sentiment, and follow the dynamics of interacting user groups (e.g. influencers vs. trolls). Network graphs show the central nodes in our analysis.
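The study's exact methods are not spelled out in the abstract; as an illustration of the unsupervised step, clustering user profiles by engagement and sentiment might look like this (the profile features and group structure are synthetic assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Synthetic user profiles: columns = (posts per week, mean sentiment in [-1, 1])
engaged_positive = rng.normal([20.0, 0.6], 0.1, size=(50, 2))   # influencer-like
engaged_negative = rng.normal([20.0, -0.6], 0.1, size=(50, 2))  # troll-like
casual = rng.normal([1.0, 0.0], 0.1, size=(50, 2))              # occasional users
profiles = np.vstack([engaged_positive, engaged_negative, casual])

# Unsupervised grouping into three user segments
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(profiles)
print(km.cluster_centers_.round(1))
```

The recovered segments are a starting point; the group dynamics and network centrality mentioned in the talk require graph analysis on top of such clusters.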
The aim of Aurebu is to recognize invoices by means of machine learning processes and to integrate the results into an automatic accounting booking process for small tour operators. The steps are divided into: 1. Business transaction identification and 2. Accounting entry.
From a business point of view, the manual processing of these transactions by a trained accountant should be significantly streamlined and, ideally, eliminated altogether. For the prototype, the following components were developed, which in turn consist of more than 40 subcomponents:
• Document analysis
• Machine Learning / AI
• Integration of industry-specific logic
• Optimization of the workflow
During the learning phase of the AI components, a training data set of 300,000 documents was used. Rule sets were defined from the perspective of travelbasys and applied in such a way that dynamic rule development could be established in a self-learning system. It is important to mention that at some point the AI system entered self-learning mode and developed into a self-adaptive control system. At the end of the training phase, mass comparisons were made between the booking records generated by the Aurebu AI from the incoming invoices and the booking records entered by humans. On average, a matching rate of > 85% was achieved by the Aurebu AI.
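The > 85% figure comes from comparing AI-generated booking records with the human reference records. The comparison itself is straightforward; the record fields below are hypothetical, since Aurebu's actual booking schema is not described:

```python
# Hypothetical booking records: (account, contra account, amount)
human = [
    ("4400", "1200", 119.00),
    ("4400", "1200", 59.50),
    ("6300", "1600", 23.80),
    ("4400", "1200", 238.00),
]
ai = [
    ("4400", "1200", 119.00),
    ("4400", "1200", 59.50),
    ("6300", "1600", 23.80),
    ("4400", "1210", 238.00),  # deviates from the human reference entry
]

# Matching rate: share of invoices where the AI record equals the reference
matches = sum(a == h for a, h in zip(ai, human))
rate = matches / len(human)
print(f"matching rate: {rate:.0%}")
```

At production scale, the same comparison is simply run over the full set of invoices instead of four toy records.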
Since the advent of the IoT, large amounts of sensor data have been read and evaluated. Most sensors serve only one specific purpose. Cameras, on the other hand, are versatile and could replace or supplement many sensors. However, in contrast to specific sensor data, images are unstructured and difficult to evaluate. What is the state of the art for analyzing this unstructured image data? Is the current technology suitable for industrial use, or is it only usable by research departments? What are the challenges in practical use? What does a practical deployment workflow for image recognition look like?
Promising forecasts for the use of artificial intelligence in medical image analysis are based on various developments: on the one hand, the pressure on radiologists is growing due to the rising use of imaging methods with increased complexity; on the other hand, health care costs are rising due to various factors. Process optimization in medicine is therefore necessary to counteract this development. In particular, images from magnetic resonance imaging (MRI) and computed tomography (CT) can be diagnosed faster and more reliably by computer intelligence than by radiologists alone. In order to achieve a high level of acceptance for these new solutions among physicians, it is necessary to focus on the process chain and offer the physician additional benefits. Using the example of the assisted diagnosis of prostate carcinoma with Deep Learning procedures, we demonstrate such a process and also briefly address the aspects of data protection and possible business models.
The inspection of technical installations is an indispensable prerequisite for maintaining the functionality of these systems. The aim of the inspection is to maintain their normal condition. In production, automatic optical inspection (AOI) is state of the art. However, the inspection of technical structures (masts, bridge constructions, etc.) or outdoor facilities is still carried out conventionally, by visual inspection or climbing. These methods are labour-intensive, time-consuming and costly. Against this background, the idea was born of planning and implementing visual inspections with autonomous systems such as unmanned aircraft systems. These are more time-efficient, can deliver results with less staff, and produce data comparable to or even better than conventional inspection. The article deals with this topic using the example of the visual inspection of overhead line poles. First, solutions with regard to accuracy and precision are presented (sensor fusion). Subsequently, the sensor-generated data volume (big data, classification) and the fully automatic data analysis based on it (deep learning) are discussed. Finally, the economic potential of the developed solution approach is presented.
Customer-oriented Digital Analytics & Optimization has long since become an integral part of any digitization strategy. Employees and decision-makers at all levels of the hierarchy rely on established analytics methods to assess and optimize their target achievement. Strategy, Culture & People, Organization, Digitization, Technology and Data are the six dimensions of the Maturity Model developed by the Digital Analytics & Optimization working group within Bitkom together with recognized experts. The focus of the investigation is the digital interaction with the consumer. A representative survey of around 1,000 companies sheds light on the status of the German economy. Gain a deep insight into the methodical approach! Learn how you can determine the maturity of your business! Take away valuable recommendations for reaching higher maturity levels!
For online clothing retailers, a user-friendly product search capability is one of the most important success factors. But for this to work, retailers must have a way to assign metadata for things like brands, colors, and cuts to every item in their database. In most cases, this data must be entered manually at great expense – an approach that is very time-consuming and prone to errors. This presentation discusses a promising alternative used by ANWR Group, operator of the popular online store “schuhe.de”. Working in collaboration with Big Data experts at ORAYLIS, the company developed an image recognition technology that analyzes product photos to automatically assign keywords to new products. It uses a pre-trained model that can be adapted to individual requirements. This eliminates a substantial portion of training expenses while providing a very high-performance model tailored to the needs of a shoe retailer.
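The concrete model is not described beyond "pre-trained and adaptable", but the general transfer-learning pattern – a frozen pre-trained feature extractor plus a small, newly trained classifier head – can be sketched as follows. The backbone here is a stand-in function and the photos and keyword are synthetic, not ANWR's actual setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stand-in for a pre-trained backbone (e.g. an ImageNet CNN): in practice
# this would map a product photo to a fixed-length embedding vector.
def pretrained_embedding(image: np.ndarray) -> np.ndarray:
    return image.reshape(-1)[:64]  # placeholder for frozen CNN features

rng = np.random.default_rng(0)
photos = rng.random((40, 8, 8))            # toy "product photos"
labels = photos.mean(axis=(1, 2)) > 0.5    # hypothetical keyword, e.g. "light-coloured"

# Transfer learning: keep the backbone frozen, train only a small
# classifier head on the shop's own labelled photos
features = np.stack([pretrained_embedding(p) for p in photos])
head = LogisticRegression().fit(features, labels)
print(head.score(features, labels))
```

Because only the small head is trained, the labelling and compute effort stays far below training a full image model from scratch, which is the cost saving the abstract refers to.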
With encompassing digitization, decision-makers have a multitude of decision-relevant information at their disposal. While structured data is already used to automate decision-making, the computational analysis of unstructured data is still a key challenge. In this talk, we show how Argument Mining can be used to unleash the potential of unstructured data and to integrate unused assets into decision-making processes. With advances in Text Mining and Deep Learning, it is now possible to search large document collections, like social media or news reports, for arguments relevant to a given topic. This opens up new search paradigms, which we validate together with Daimler AG for the systematic assessment of novel innovations in the field of mobility and transport.
This project of Hof University's research center for car infotainment and man-machine interfaces develops a user-support system for complex production machines built by Hans Weber Maschinenfabrik in Kronach. It assists the operator by displaying process-situation-specific recommendations. The knowledge base for the recommendations is generated automatically from logged data on former user interaction. An error message, for example, will be displayed with a recommendation like "in 90% of all occurrences of this error the operators performed the following keystrokes". The recommender software is largely machine-independent and can therefore be refinanced across several machine types produced in small lots.
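The knowledge base described above boils down to counting which operator reactions followed which error messages in the interaction logs. A minimal sketch, with an invented log format (error codes and key sequences are hypothetical):

```python
from collections import Counter, defaultdict

# Hypothetical interaction log: (error code, key sequence the operator used)
log = [
    ("E042", "MENU>CALIB>START"),
    ("E042", "MENU>CALIB>START"),
    ("E042", "MENU>RESET"),
    ("E107", "MENU>FEED>STOP"),
]

# Build the knowledge base: per error, count the observed reactions
reactions = defaultdict(Counter)
for error, keys in log:
    reactions[error][keys] += 1

def recommend(error: str) -> str:
    """Return the most frequent past reaction with its share of occurrences."""
    counts = reactions[error]
    keys, n = counts.most_common(1)[0]
    share = n / sum(counts.values())
    return f"In {share:.0%} of occurrences operators performed: {keys}"

print(recommend("E042"))
```

Because this counting logic only depends on the log format, not on any one machine, it is consistent with the machine-independence claim in the abstract.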
At mobile.de, Germany's biggest car marketplace, a dedicated data team, supported by the IT project house inovex, is responsible for creating smart data products. One focus is personalised vehicle recommendations to improve the customer experience, both while browsing and when finding the perfect offering. As an introduction, we briefly cover the traditional approaches to recommendation engines, thereby motivating the need for more sophisticated ones. We then illustrate how Deep Learning can be leveraged to capture the underlying non-linear correlations of features for personalised recommendations. In particular, we have customised Google Play's algorithm for an online marketplace with a fast-changing inventory. Several variants of our adapted approach are evaluated against traditional methods, and scalability aspects are addressed. We conclude our talk with an outlook on the importance of personalised user experiences and the application of Deep Learning and AI at mobile.de.
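mobile.de's adapted model is not public, but the serving step shared by most deep recommenders – scoring a fast-changing inventory against a learned user representation – can be sketched with toy embeddings. In a deep model these vectors would come from the network's hidden layers; here they are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy learned embeddings (random stand-ins, not trained vectors)
user_emb = rng.normal(size=(3, 8))    # 3 users
item_emb = rng.normal(size=(100, 8))  # fast-changing inventory of 100 listings

def top_k(user: int, k: int = 5) -> np.ndarray:
    """Rank listings for a user by embedding similarity."""
    scores = item_emb @ user_emb[user]
    return np.argsort(scores)[::-1][:k]

print(top_k(0))
```

New listings only need an embedding to become recommendable, which is why this pattern suits a marketplace whose inventory turns over quickly.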
Artificial Intelligence, Machine Learning and the Internet of Things will soon change and revolutionize many areas of life. However, the development and use of artificial intelligence also entails several (legal) requirements for companies: With the EU General Data Protection Regulation (GDPR), which will become effective in May 2018, data protection law is facing a reform that presents companies with enormous challenges. In particular the GDPR strengthens the rights of data subjects, i.e. natural persons whose personal data are processed. The topics discussed include the right to be forgotten, the Data Protection Impact Assessment (DPIA) and Privacy-by-Design/Default. In his presentation Dennis Kurpierz explains the new requirements of the GDPR and raises companies' awareness for the use of Artificial Intelligence in terms of data protection law.