
Berlin Big Data Center (BBDC)

The Berlin Big Data Center (BBDC) develops highly innovative technologies for organizing vast amounts of data and deriving informed decisions from them, in order to create economic and social value. It achieves this by merging the previously isolated disciplines of data management and machine learning. The center's technologies reduce the cost of analyzing big data, broaden the group of people who can perform large-scale data analysis, and strengthen Germany's leading position in this field in science and industry. The focus is on three exemplary application areas of economic, scientific, and social relevance: materials science, medicine, and information marketplaces. Building on internationally recognized leading-edge research, we aim to enable automatic optimization, parallelization, and scalable, adaptive processing algorithms. This covers work in the areas of machine learning, linear algebra, statistics, probability theory, computational linguistics, and signal processing.

In order to optimally prepare industry, science, and society in Germany and Europe for the global big data trend, tightly coordinated activities in research, teaching, and technology transfer are required, integrating data analysis methods with scalable data processing. To this end, the Berlin Big Data Center pursues the following seven objectives:

  1. Pooling expertise in scalable data management, data analytics, and big data applications.
  2. Conducting fundamental research to develop novel and automatically scalable technologies capable of performing “Deep Analysis” of “Big Data”.
  3. Developing an integrated, declarative, highly scalable open-source system that enables the specification, automatic optimization, parallelization and hardware adaptation, and fault-tolerant, efficient execution of advanced data analysis problems, using varying methods (e.g., drawn from machine learning, linear algebra, statistics and probability theory, computational linguistics, or signal processing), leveraging our work on Apache Flink.
  4. Transferring technology and know-how to support innovation in companies and startups.
  5. Educating data scientists with respect to the five big data dimensions (i.e., application, economic, legal, social, and technological) via leading educational programs.
  6. Empowering people to leverage “Smart Data”, i.e., to discover new insights in their massive data sets.
  7. Enabling the general public to conduct sound data-driven decision-making.

Consortium

TU Berlin:

  • Database Systems and Information Management (DIMA)
  • Machine Learning (ML)
  • Internet Network Architectures (INET)
  • Complex and Distributed IT Systems (CIT)
  • Image Communication (IC)

DFKI

  • Language Technology Lab (LT)
  • Intelligent Analysis of Mass Data Lab (IAM)

Zuse Institute Berlin

  • Distributed Algorithms and Supercomputing (DAS)
  • Mathematics for Life and Materials Sciences (MfLMS)

Beuth Hochschule

  • Data Science Lab (DSL)

Fritz Haber Institute of the Max Planck Society

  • Theory Department (MPI/FHI)

Further project information

  • Funding: Federal Ministry of Education and Research
  • Funding code: 01IS14013A
  • Website: www.bbdc.berlin


Contact

Thomas Renner
+49 (30) 314-79675
E-N 101