
Scalable, precise, system-wide, data-driven usage control across layers of abstraction and machines

Subject Area: Software Engineering and Programming Languages
Funding: 2010 to 2016
Project Identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 183688753
 
Usage control requirements stipulate constraints on the usage of data after access to them has been granted (e.g., "delete within three days", "do not copy"). If usage control requirements are to be enforced, one must take into account that data exist in multiple representations. This requires tracking data flows from one representation to another, both within and across different layers of abstraction. During the first phase of the priority program, we developed the fundamental theory and technology for data-driven usage control across layers of abstraction: an extension of usage control with data-flow detection concepts. With this framework, usage control requirements can be enforced for all representations of a data item at the same time. For example, if a policy stipulates that a data item be deleted, then our system also deletes all copies, or representations, of that item.

During the ongoing second phase of the project, we have increased the precision of the approach by building an enhanced data-flow model for multiple interconnected layers of abstraction, and by exploiting both the structure and the quantity of data. In the third phase of the project, we plan to increase the scalability of the approach both within one machine and across multiple machines, thereby turning the developed theory and technology into a viable approach for security in the large. We plan to tackle the problem from two angles of attack:

* Security in the large through compositionality of distributed decisions: We want to study how adherence to global usage control policies can be evaluated without one central policy decision point that needs to be aware of all events in the system.

* Scalability: Experience confirms the intuition that the scalability of the approach depends on the granularity of the usage events being monitored. While it seems unlikely that the known asymptotic complexity-theoretic bounds can be improved, we want to perform systematic analyses of the trade-off between the granularity of data tainting, the precision of the approach, and its scalability, and thus gain insights into possible optimizations.
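The first-phase framework described above can be sketched as a minimal data-flow model: a tracker records which containers hold a representation of a data item, propagates that set on copy events, and enforces a delete policy by removing all representations at once. The class and method names below are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch of data-driven usage control (names are hypothetical).
# A data-flow model maps each data item to the set of containers that
# hold a representation of it; enforcing "delete" removes them all.
from collections import defaultdict


class DataFlowTracker:
    """Maps abstract data items to the containers holding a copy of them."""

    def __init__(self):
        self.representations = defaultdict(set)  # data item -> set of containers

    def store(self, data_item, container):
        """Record that `container` now holds a representation of `data_item`."""
        self.representations[data_item].add(container)

    def copy(self, src, dst):
        """A copy event: every data item represented in `src` flows into `dst`."""
        for containers in self.representations.values():
            if src in containers:
                containers.add(dst)

    def enforce_delete(self, data_item):
        """Delete policy: report and forget *all* representations of the item."""
        return sorted(self.representations.pop(data_item, set()))


tracker = DataFlowTracker()
tracker.store("photo", "/home/alice/photo.jpg")
tracker.copy("/home/alice/photo.jpg", "/tmp/cache/img01")  # OS-level copy
tracker.copy("/tmp/cache/img01", "clipboard")              # window-manager layer
print(tracker.enforce_delete("photo"))  # all three representations are deleted
```

The point of the sketch is that the policy refers to the abstract data item ("photo"), while enforcement acts on every concrete representation that the data-flow model has accumulated across layers.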
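The granularity trade-off mentioned under Scalability can be illustrated with a toy comparison: tainting whole containers keeps the monitoring state small but over-approximates, while byte-level tainting is precise but grows with the amount of sensitive data. The event format and function names are hypothetical.

```python
# Toy illustration of the granularity/precision trade-off in data tainting
# (all names hypothetical). Events are (container, offset, sensitive) triples.

def monitor(events, granularity):
    """Build the taint state under a given tracking granularity.

    granularity="container": one label per container (cheap, imprecise).
    granularity="byte": one label per (container, offset) (precise, larger state).
    """
    state = set()
    for container, offset, sensitive in events:
        if sensitive:
            state.add(container if granularity == "container" else (container, offset))
    return state

def is_sensitive(state, container, offset, granularity):
    """Answer a taint query against the recorded state."""
    if granularity == "container":
        return container in state
    return (container, offset) in state

# 10,000 bytes written to one file; only the first 3 bytes are sensitive.
events = [("out.bin", i, i < 3) for i in range(10_000)]
coarse = monitor(events, "container")
fine = monitor(events, "byte")

print(len(coarse), len(fine))  # 1 vs. 3 taint records
print(is_sensitive(coarse, "out.bin", 9999, "container"))  # True: false positive
print(is_sensitive(fine, "out.bin", 9999, "byte"))         # False: precise
```

Coarse tracking scales well (state size is independent of the data volume) but flags the entire file as sensitive; fine tracking answers queries precisely but its state grows with the data, which is exactly the trade-off the planned analyses target.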
DFG Programme: Priority Programmes
 
 
