The number of open APIs made available on the Web by means of Internet technology is growing exponentially. Innovative solutions built on compositions of such APIs are increasingly positioning themselves as drivers of digitalisation. Against this background, open APIs need to be designed from the perspective of their domain-specific use, so that they can be employed in a stable and sustainable manner across the widest possible range of application scenarios. In this context, one also speaks of an "API first" strategy. The first session of the workshop is devoted to this topic and addresses, among other things, questions of API design using an OpenAPI or Swagger specification. The second session then deals with the API management required on the development and operations side, covering both the identification and authentication of individual API accesses and the topic of DevOps-oriented API provisioning. Innovative techniques that can be used in the context of compositional software development are the focus of the last session. It begins with the query language GraphQL, promoted by Facebook, which is intended to simplify the integration of HTTP-based APIs. A further contribution describes the development of a mobile service application that makes extensive use of open APIs; it largely dispenses with locally installed development environments, i.e. development itself also takes place in the cloud. The thematic background of this workshop stems from last year's ESAPI workshop in Berlin (specifically its World Café). The results of the questions discussed there are documented in the final contribution.
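As a minimal illustration of the GraphQL idea mentioned above (endpoint URL, types and fields are hypothetical, not tied to any workshop contribution), a client sends a single HTTP POST whose query names exactly the fields it needs:

```python
import requests

# Hypothetical GraphQL endpoint; the query shape depends on the actual schema.
GRAPHQL_URL = "https://api.example.com/graphql"

QUERY = """
query ($id: ID!) {
  service(id: $id) {
    name
    status
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": QUERY, "variables": {"id": "42"}},
    timeout=10,
)
response.raise_for_status()
print(response.json()["data"])  # only the requested fields are returned
```

Compared with stitching together several REST calls, the client declares its data needs in one request, which is the simplification of HTTP API integration the abstract alludes to.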
Since its introduction in 2014, the face morphing forgery (FMF) attack has received significant attention from the biometric and media forensic research communities. The attack aims at creating artificially weakened templates which can be successfully matched against multiple persons. If successful, the attack has an immense impact on many biometric authentication scenarios, including the use of electronic machine-readable travel documents (eMRTD) at automated border control gates. We extend the StirTrace framework for benchmarking FMF attacks by adding five components: a novel three-fold definition for the quality of morphed images, a novel FMF realisation (combined morphing), a post-processing operation to simulate the digital image format used in eMRTD (passport scaling 15 kB), an automated face recognition system (VGG face descriptor) as an additional means for biometric quality assessment, and two feature spaces for FMF detection (keypoint features and a fusion of keypoint and Benford features) as additional means for forensic quality assessment. We show that the impact of StirTrace post-processing operations on the biometric quality of morphed face images is negligible except for two noise operators and passport scaling 15 kB, that the impact on the forensic quality depends on the type of post-processing, and that the new FMF realisation outperforms the previously considered ones.
Reducing the False Alarm Rate for Face Morph Detection by a Morph Pipeline Footprint Detector
(2018)
This paper introduces an approach to the automatic generation of visually faultless facial morphs, along with a proposal for how such morphs can be automatically detected. The created morphs are intended to be unrecognizable as such with the naked eye, while a reference automatic face recognition (AFR) system produces high similarity scores when matching a morph against faces of the persons who participated in the morphing. Automatic generation of morphs allows for creating abundant experimental data, which is essential (i) for evaluating the ability of AFR systems to reject morphs and (ii) for training forensic systems to detect morphs. Our first experiment shows that human performance in distinguishing between morphed and genuine face images is close to random guessing. In our second experiment, the reference AFR system verified 11.78% of morphs against any of the genuine images at a decision threshold corresponding to a 1% false acceptance rate. These results indicate that facial morphing is a serious threat to access control systems aided by AFR and establish the need for morph detection approaches. Our third experiment shows that the distribution of Benford features extracted from quantized DCT coefficients of JPEG-compressed morphs is substantially different from that of genuine images, enabling the automatic detection of morphs.
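A minimal sketch of the idea behind such Benford features is shown below. It assumes the quantized DCT coefficients of a JPEG image are already available as a numpy array (e.g., extracted with a JPEG library) and is an illustration of the principle, not the authors' exact feature extraction:

```python
import numpy as np

# Benford's law for leading digits d = 1..9: P(d) = log10(1 + 1/d)
BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))

def first_digit_distribution(coeffs):
    """Empirical distribution of the leading digits of nonzero |coefficients|."""
    c = np.abs(coeffs[coeffs != 0]).astype(np.float64)
    lead = (c / 10.0 ** np.floor(np.log10(c))).astype(int)  # leading digit 1..9
    hist = np.bincount(lead, minlength=10)[1:10]
    return hist / hist.sum()

def benford_divergence(coeffs):
    """Chi-square-style distance to Benford's law; larger values suggest
    first-digit statistics unlike those of a singly compressed genuine image."""
    obs = first_digit_distribution(coeffs)
    return float(np.sum((obs - BENFORD) ** 2 / BENFORD))
```

Because morphing implies re-compression and blending, the first-digit statistics of a morph's DCT coefficients tend to deviate from the Benford-like pattern of a genuine image, which is what such a divergence score can pick up.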
We analyze StirTrace with regard to benchmarking face morphing forgeries and extend it with additional scaling functions for the face biometrics scenario. We benchmark a Benford's law based multi-compression-anomaly detection approach and the acceptance rates of morphs for a face matcher to determine the impact of the processing on the quality of the forgeries. We use two different approaches for automatically creating 3940 morphed face images. Based on this data set, 86614 images are created using StirTrace. A manual selection of 183 high-quality morphs is used to derive tendencies based on the subjective forgery quality. Our results show that the anomaly detection seems to be able to detect anomalies in the morphing regions, and that the multi-compression-anomaly detection performance after the processing can be differentiated into good (e.g. cropping), partially critical (e.g. rotation) and critical results (e.g. additive noise). The influence of the processing on the biometric matcher is marginal.
In 2014, a novel identity theft scheme targeting specific application scenarios in face biometrics was introduced. In this scheme, a so-called face morph melts two or more face images of different persons into one image, which is visually similar to multiple real-world persons. Based on this non-authentic image, it is possible to apply for an image-based identity document to be issued by a corresponding authority. Thus, multiple persons can use such a document to pass image-based person verification scenarios with a single document containing an artificially weakened template. Currently, no reliable security mechanism exists to detect this attack.
The paper presents an observation and evaluation of a recovery process. Seven weeks after a fracture of the fibula and six weeks after surgery, measurement of the subject's gait was started. Inertial measurement units applied above both ankles are used to acquire kinematic data, and three surface EMG sensors on each leg provide information about the activity of three muscles. The intensity of the weekly tests was increased depending on the healing progress. Both scenarios, walking on the floor and on the treadmill, were included in the experiments to obtain comparable data sets. Besides investigating the healing process, the paper focuses on the symmetry/asymmetry properties of the gait. The results indicate that asymmetry is not only an essential property of pathologic gait but is present in normal gait too, especially if short distances are evaluated. It may be caused by the influence of a dominant/injured leg, by the first/last steps, by the natural variance of the gait or by the daily fitness of the subject. Asymmetry is evident in the contribution of both legs to a stride (step length, strike and lift angle, stance and swing phase).
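A common way to quantify the left/right asymmetry described here is a symmetry index computed per gait parameter; the definition below is a standard choice and an assumption of this sketch, not necessarily the exact measure used in the paper:

```latex
% X_L, X_R: the same gait parameter (e.g., step length) measured for the
% left and right leg; SI = 0 indicates perfect symmetry.
\[
  \mathrm{SI} = \frac{2\,(X_L - X_R)}{X_L + X_R} \times 100\,\%
\]
```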
Our world changes at varying speeds and with varying depths of disruption. This is the only reliable constant for the economy and society.
On the one hand, creative destruction is part of the 'brand essence' of market-economy systems. Economic actors at any given location are therefore under a kind of permanent (productive) stress. There are winners and losers, because competitors differ in how successfully they realise cost-cutting rationalisation, technical progress and the development of new (global) markets. The accompanying structural change is regarded as the normal state of the economic system.
On the other hand, there are crises. Even 'under shock', actors (re)act unequally, as the COVID-19 pandemic or the abrupt changes brought about by the financial crisis show. Some actors and locations remain productively resilient with their particular profile and potential. For others, economic decline and its consequences for labour markets and the quality of life of those affected intensify.
Regional economic development supports the alignment of a location's economic structures so that they remain viable. But can it also add value in dealing with crises, and is this even part of its regular mandate? And what role does digitalisation play? This contribution establishes the logical connection between these sub-questions and classifies recent approaches. It also identifies starting points for developing regional economic development further in a practice-oriented way. Here, 'electronic government' (e-government) describes the use of information technology, telecommunications and media (ITKM) in the public sector, while 'electronic governance' (e-governance) also includes hybrid or private actors.
The contribution is structured as follows: first, key concepts such as structural policy and resilience are clarified and linked to the contextual factors that determine the steering logic of economic development. It then discusses how economic development can use e-government and e-governance to strengthen resilience and achieve the most transformative effect possible.
The public sector faces several challenges that need to be addressed, such as numerous external and internal demands for change and citizens' dissatisfaction and frustration with public sector organizations. An alternative to the traditional top-down development of public services is the co-creation of public services. Co-creation promotes collaboration between stakeholders with the aim of creating better public services and achieving public values. At the same time, data analytics has been fuelled by the availability of immense amounts of textual data. Whilst both co-creation and Text Analytics (TA) have been used in the private sector, we study existing works on the application of TA techniques to text data in support of public service co-creation. We systematically review 75 of the 979 papers that focus directly or indirectly on the application of TA in the context of public service development. In our review, we analyze the TA techniques, the public service they support, the public value outcomes, and the co-creation phase they are used in. Our findings indicate that the application of TA to co-creation is still in its early stages and thus still limited. Our research framework promotes the concept and stimulates the strengthening of the role of TA techniques in supporting public sector organisations and their use of the co-creation process. From the standpoint of policy-makers and public administration managers, our findings and the proposed research framework can be used as a guideline for developing a strategy for designing co-created and user-centred public services.
Limiting Factors
(2023)
As with all risks, there is no such thing as zero risk when it comes to compliance risks. In practice, one will not even come close to zero. A zero compliance risk is also unattainable because new risk areas keep opening up, for example through changes to business models or distribution channels, or through new legal framework conditions.
Disclinations or disclination clusters in smectic C freely suspended films with topological charges larger than one are unstable. They disintegrate, preferably in a spatially symmetric fashion, into single defects with individual charges of +1, which is the smallest positive topological charge allowed in polar vector fields. While the opposite process of defect annihilation is well-defined by the initial defect positions, disintegration starts from a singular state and the following scenario including the emerging regular defect patterns must be selected by specific mechanisms. We analyze experimental data and compare them with a simple model where the defect clusters adiabatically pass quasi-equilibrium solutions in one-constant approximation. It is found that the defects arrange in geometrical patterns that correspond very closely to superimposed singular defect solutions, without additional director distortions. The patterns expand by affine transformations where all distances between individual defects scale with the same time-dependent scaling factor proportional to the square-root of time.
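Written out, the affine expansion described in the last sentence takes the following form (a direct transcription of the stated scaling, with r_i the position of defect i relative to the cluster center):

```latex
% All inter-defect distances grow with one common, square-root-of-time
% scaling factor.
\[
  \mathbf{r}_i(t) = \lambda(t)\,\mathbf{r}_i(0), \qquad \lambda(t) \propto \sqrt{t}
\]
```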
We present a method for the arbitrage-free interpolation of plain-vanilla option prices and implied volatilities, which is based on a system of integral equations that relates terminal density and option prices. Using a discretization of the terminal density, we write these integral equations as a system of linear equations. We show that the kernel matrix of this system is, in general, ill-conditioned, so that it cannot be solved for the discretized density using a naive approach. Instead, we construct a sparse model for the kernel matrix using singular value decomposition (SVD), which not only allows us to systematically improve the condition number of the kernel matrix, but also determines the computational effort and accuracy of our method. In order to allow for the treatment of realistic inputs that may contain arbitrage, we reformulate the system of linear equations as an optimization problem, in which the SVD-transformed density minimizes the error between the input prices and the arbitrage-free prices generated by our method. To further stabilize the method in the presence of noisy input prices or arbitrage, we apply an L1-regularization to the SVD-transformed density. Our approach, which is inspired by recent progress in theoretical physics, offers a flexible and efficient framework for the arbitrage-free interpolation of plain-vanilla option prices and implied volatilities, without the need to explicitly specify a stochastic process, expansion basis functions or any other kind of model. We demonstrate the capabilities of our method in a number of artificial and realistic test cases.
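A minimal sketch of the truncated-SVD step is given below. It shows only the plain least-squares setting; variable names are assumptions, and the paper's full method additionally reformulates this as an L1-regularized optimization to handle noisy or arbitrageable inputs:

```python
import numpy as np

def truncated_svd_solve(K, p, k):
    """Solve K @ q ~= p for the discretized density q, keeping only the k
    largest singular values to control the condition number of K."""
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    s_inv = np.zeros_like(s)
    s_inv[:k] = 1.0 / s[:k]            # discard small, noise-amplifying modes
    return Vt.T @ (s_inv * (U.T @ p))  # truncated pseudo-inverse applied to p
```

Increasing k trades a closer fit to the input prices against a worse effective condition number, which is the accuracy/effort dial the abstract refers to.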
The Patellostabilometer: A New Device for Quantification of Mediolateral Patella Displacement
(2023)
Mediolateral patella displacement is of interest for diagnostics and clinically relevant research questions. Apart from manual testing, no standardized method is currently available. Proper quantification of patella mobility is necessary to better understand pathologies at the patellofemoral joint. Patella mobility was assessed in 25 healthy individuals using a Patellostabilometer, a new prototype instrument for quantification of mediolateral patella displacement. The participants underwent measurements of the mediolateral displacement three times using the Patellostabilometer. A maximal force of 10 N was applied for patella movement. Additionally, leg length and the circumferences of the knee, upper leg and lower leg were measured. A lateral patella displacement of 18.27 ± 3.76 mm (range 15.85–20.64 mm, interquartile range (IQR) of 4.79 mm) was measured. The medial patella displacement was 24.47 ± 6.59 mm (range 19.29–29.76 mm, IQR of 10.47 mm). The test–retest measurement error was 2.32 ± 1.76 mm (IQR of 2.38 mm), with five outliers. There was greater test–retest variability between the measurements of the medial displacement compared to the lateral one. The test–retest variability reached 7% of the patella displacement. Other parameters showed no significant correlations. Given this natural patellofemoral mobility, a precise and clinically relevant quantification of patella mobility is possible.
Context awareness is critical for the successful execution of processes. In the abundance of business process management (BPM) research, frameworks exclusively devoted to extracting context from textual process data are scarce. With the deluge of textual data and its increasing value for organizations, it becomes essential to employ relevant text analytics techniques to increase the awareness of process workers, which is important for process execution. The present paper addresses this demand by developing a framework for context awareness based on process-execution-related textual data, using a well-established layered BPM context model. This framework combines and maps various text analytics techniques to the layers of the context model, aiming to increase the context awareness of process workers and facilitate informed decision-making. The framework is applied in an IT ticket processing case study. The findings show that contextual information obtained using our framework enriches the awareness of process workers regarding process instance urgency, complexity, and upcoming tasks, and assists in making decisions in terms of these aspects.
Compliance-Management
(2023)
This book offers a compact, practice-oriented introduction to the foundations and requirements of compliance management, in order to quickly familiarise the reader with current and central regulations. It addresses statutory and regulatory provisions as well as business necessities, with the aim of protecting employees and companies from the severe consequences of compliance violations.
An approach for analyzing business process execution complexity based on textual data and event log
(2023)
With the advent of digital transformation, organizations increasingly rely on various information systems to support their business processes (BPs). Recorded data, including textual data and event logs, expand exponentially, complicating decision-making and posing new challenges for BP complexity analysis in Business Process Management (BPM). Here, Process Mining (PM) serves to derive insights from historic BP execution data, called the event log. However, in PM, textual data is often neglected or limited to BP descriptions. Therefore, in this study, we propose a novel approach for analyzing BP execution complexity by combining textual data, which serves as an input at the BP start, with the event log. The approach is aimed at studying the connection between the complexities obtained from these two data types. For textual data-based complexity, the approach employs a set of linguistic features. In our previous work, we explored the design of linguistic features favorable for BP execution complexity prediction; accordingly, we adapt and incorporate them into the proposed approach. Using these features, various machine learning techniques are applied to predict textual data-based complexity. Moreover, in this prediction, we show the adequacy of our linguistic features, which outperformed the linguistic features of a widely used text analysis technique. To calculate event log-based complexity, the event log and relevant complexity metrics are used. Afterward, a correlation analysis of the two complexities and an analysis of the significant differences in correlations are performed. The results serve to derive recommendations and insights for BP improvement. We apply the approach to the IT ticket handling process of the IT department of an academic institution. Our findings show that the suggested approach enables a comprehensive identification of BP redesign and improvement opportunities.
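The pipeline can be pictured with a toy sketch like the following; the feature set, model choice and data are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestRegressor

def linguistic_features(texts):
    """A few simple surface features per text; the paper's feature set is richer."""
    rows = []
    for t in texts:
        tokens = t.split()
        n = max(len(tokens), 1)
        rows.append([
            len(tokens),                               # length in tokens
            sum(len(w) for w in tokens) / n,           # mean word length
            len(set(w.lower() for w in tokens)) / n,   # type-token ratio
        ])
    return np.array(rows)

# Toy data: ticket texts available at BP start, plus an event-log-based
# complexity score per completed case (both entirely made up here).
texts = [
    "printer offline",
    "VPN drops after the latest client update, error 809, affects home office",
    "password reset",
    "recurring outlook sync failures across several shared mailboxes",
]
log_complexity = np.array([1.0, 4.0, 1.5, 3.5])

X = linguistic_features(texts)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, log_complexity)
text_complexity = model.predict(X)

# Correlation analysis between the two complexity measures
rho, _ = spearmanr(text_complexity, log_complexity)
print(f"Correlation between text- and log-based complexity: rho={rho:.2f}")
```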
Layered van der Waals ferromagnets, which preserve their magnetic properties down to exfoliated monolayers, are fueling an abundance of fundamental research and nanoscale device demonstrations. CrGeTe3 is a prime example of this class of materials. Its temperature-pressure phase diagram features an insulator-to-metal transition and a significant increase in ferromagnetic Curie-Weiss temperatures upon entering the metallic state. We use density functional theory to understand the magnetic exchange interactions in CrGeTe3 at ambient and elevated pressures. We calculate Heisenberg exchange couplings, which reproduce the correct ferromagnetic ground state and explain the experimentally observed pressure dependence of magnetism in CrGeTe3. Furthermore, we combine density functional theory with dynamical mean-field theory to investigate the effects of electronic correlations and the nature of the high-pressure metallic state in CrGeTe3.
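The exchange couplings referred to here are the parameters of a classical Heisenberg model; the form below uses a common sign convention (J_ij > 0 ferromagnetic), which is an assumption of this note rather than a statement from the paper:

```latex
% With this convention, J_ij > 0 favors parallel spins, i.e., the
% ferromagnetic ground state found for CrGeTe3.
\[
  H = -\sum_{i<j} J_{ij}\,\mathbf{S}_i \cdot \mathbf{S}_j
\]
```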