Gold Open Access – first publication in an Open Access medium
Automated deep learning based detection of cellular deposits on clinically used ECMO membrane lungs
(2026)
Introduction:
Despite the promising application of extracorporeal membrane oxygenation (ECMO) in the treatment of critically ill patients, coagulation-associated technical complications, primarily clot formation and critical bleeding, remain a major challenge during ECMO therapy. The deposition of nucleated cells on device surfaces has been demonstrated, yet the role of these cells in the development of complications remains a matter of ongoing research. The membrane lung (MemL) in particular is prone to clot formation. Investigating nuclear deposits on its hollow fibers may therefore provide insights for a better understanding of the cellular mechanisms involved in the development of ECMO complications.
Methods:
To support current research, this study aimed to develop a deep learning–based tool for the automated detection and quantitative analysis of nuclear depositions on MemL hollow-fiber mats. A customized fluorescence microscopy workflow, combined with a semi-automated iterative labeling strategy, was used to generate a high-quality dataset for model training.
Results:
Six configurations of instance segmentation models were evaluated; a Mask R-CNN with a ResNet-101 backbone using dilated convolutions provided the most balanced performance in both nuclei count and area accuracy. Compared with U-Net–based approaches such as Cellpose or StarDist, the proposed model demonstrated superior segmentation of overlapping and low-intensity nuclei, maintaining accuracy even in densely packed cellular regions.
Discussion:
We present an automated image analysis tool for clinically used MemLs, which exhibit complex three-dimensional hollow-fiber architectures and irregular cellular deposits that challenge conventional tools. A dedicated graphical user interface enables streamlined detection, morphometric analysis, and spatial clustering of nuclei, establishing a reproducible workflow for high-throughput analysis of fluorescence microscopy images. This approach eliminates labor-intensive manual counting and facilitates large-scale studies on cell-fiber interactions and disease-related correlations.
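As a rough illustration of the inference step described above, the sketch below runs an off-the-shelf Mask R-CNN from torchvision and derives the two metrics the study reports (nuclei count and total mask area). Everything here is an assumption for illustration: the ResNet-50 FPN backbone, the COCO-pretrained weights, the 0.5 thresholds, and the random placeholder image stand in for the paper's fine-tuned ResNet-101 model with dilated convolutions, which is not reproduced.

```python
# Minimal sketch: instance segmentation with torchvision's Mask R-CNN,
# then deriving nuclei count and total mask area. The backbone, weights,
# thresholds, and input are stand-ins, not the paper's fine-tuned model.
import torch
from torchvision.models.detection import (
    maskrcnn_resnet50_fpn, MaskRCNN_ResNet50_FPN_Weights,
)

weights = MaskRCNN_ResNet50_FPN_Weights.DEFAULT   # COCO weights, placeholder
model = maskrcnn_resnet50_fpn(weights=weights).eval()

# A real fluorescence channel would be replicated across three channels
# and scaled to [0, 1]; here a random tensor keeps the sketch runnable.
image = torch.rand(3, 512, 512)

with torch.no_grad():
    pred = model([image])[0]   # dict with "boxes", "scores", "masks", ...

# Keep confident instances, binarize their soft masks, and aggregate the
# two quantities the abstract mentions: nuclei count and area.
keep = pred["scores"] > 0.5
masks = pred["masks"][keep] > 0.5      # (N, 1, H, W) boolean masks
n_nuclei = int(keep.sum())
total_area_px = int(masks.sum())
print(f"nuclei: {n_nuclei}, total area: {total_area_px} px")
```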
This paper presents a scalable machine learning pipeline for extracting actionable, product-related insights from user-generated social media comments. Leveraging sentence embeddings from SBERT and unsupervised clustering (k-Means and agglomerative), the approach structures informal and noisy comments from Instagram and YouTube into topic groups intended to support thematic analysis. A case study on feedback regarding BMW vehicles, comprising more than 26,000 comments, illustrates how the pipeline can reveal recurring user concerns, such as design critiques, usability issues, and technology-related expectations, even in short and unstructured social media comments. The proposed pipeline operates without labeled data or manual annotation, enabling scalable application and transferability across product categories and industries. By transforming large-scale, unstructured consumer feedback into interpretable themes, the pipeline provides product teams with an efficient and structured basis for data-driven product development and improvement.
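A minimal sketch of the embed-then-cluster pipeline described above, using the sentence-transformers and scikit-learn libraries. The model name (all-MiniLM-L6-v2), the cluster count, and the toy comments are illustrative assumptions, not the paper's exact settings.

```python
# Sketch: embed short social media comments with SBERT, then group them
# with k-Means. No labels or manual annotation are required.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

comments = [
    "The new grille design looks way too aggressive",
    "Infotainment screen freezes after the update",
    "When will the electric version be available?",
]

model = SentenceTransformer("all-MiniLM-L6-v2")   # assumed embedding model
embeddings = model.encode(comments, normalize_embeddings=True)

kmeans = KMeans(n_clusters=2, n_init="auto", random_state=0)
labels = kmeans.fit_predict(embeddings)

# Each cluster of comments becomes a candidate theme for product teams.
for comment, label in zip(comments, labels):
    print(label, comment)
```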
Noisy Intermediate-Scale Quantum (NISQ) computers, despite their limitations, present opportunities for near-term quantum advantages in Nuclear and High-Energy Physics (NHEP) when paired with specially designed quantum algorithms and processing units. This study focuses on core algorithms that solve optimization problems through the quadratic Ising or Quadratic Unconstrained Binary Optimisation (QUBO) model, specifically Quantum Annealing and the Quantum Approximate Optimisation Algorithm (QAOA).
In particular, we estimate runtimes and scalability for the task of particle Track Reconstruction (TR), a key computing challenge in NHEP, and investigate how the classical parameter space in QAOA, along with techniques like a Fourier-analysis-based heuristic, can facilitate future quantum advantages. The findings indicate that low-frequency components in the parameter space are crucial for effective annealing schedules, suggesting that heuristics can improve resource efficiency while achieving near-optimal results. Overall, the study highlights the potential of NISQ computers in NHEP and the significance of co-design approaches and heuristic techniques in overcoming challenges in quantum algorithms.
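For reference, the QUBO form these algorithms address and its mapping to the Ising model are textbook identities (not specific to this study):

```latex
\min_{x \in \{0,1\}^n} \; x^{\top} Q x,
\qquad x_i = \tfrac{1}{2}\,(1 - s_i),\; s_i \in \{-1,+1\}
\;\Longrightarrow\;
H(s) = \sum_{i<j} J_{ij}\, s_i s_j + \sum_{i} h_i\, s_i + \text{const},
```

where the couplings $J_{ij}$ and fields $h_i$ follow directly from the entries of $Q$ after the substitution.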
Many of the envisioned use cases for quantum computers involve optimisation processes. While there are many algorithmic primitives to perform the required calculations, all eventually lead to quantum gates operating on quantum bits, in an order determined by the structure of the objective function and the properties of the target hardware. When the structure of the problem representation is not aligned with the structure and boundary conditions of the executing hardware, various overheads that degrade the computation may arise, potentially negating any quantum advantage.
Therefore, automatic transformations of problem representations play an important role in quantum computing when descriptions (semi-)targeted at humans must be cast into forms that can be “executed” on quantum computers. Mathematically equivalent formulations are known to result in substantially different non-functional properties depending on hardware, algorithm, and details of the problem. Given the current state of noisy intermediate-scale quantum (NISQ) hardware, these effects are considerably more pronounced than in classical computing. Likewise, the efficiency of the transformation itself is relevant, because a possible quantum advantage may easily be eradicated by the overhead of transforming between representations. In this paper, we consider a specific class of higher-order representations, polynomial unconstrained binary optimisation problems (PUBOs), and devise novel automatic transformation mechanisms into the widely used QUBO form that substantially improve efficiency and versatility over the state of the art. In addition, we conduct a comprehensive investigation of industry-relevant problem formulations and their conversion into a quantum-specific representation, identifying significant obstacles in scaling behaviour and demonstrating how these can be circumvented.
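The paper's own transformation mechanisms are not spelled out in this abstract. As background, the classical Rosenberg reduction is the standard way to quadratize a PUBO term: a product x1·x2 is replaced by an auxiliary binary y, enforced by the penalty x1·x2 − 2(x1 + x2)·y + 3y, which equals zero exactly when y = x1·x2 and is positive otherwise. A minimal sketch for one cubic monomial, with hypothetical variable labels:

```python
# Background sketch (not the paper's mechanism): Rosenberg quadratization.
# A cubic term c * x1*x2*x3 becomes quadratic by substituting y = x1*x2
# and adding M * (x1*x2 - 2*(x1 + x2)*y + 3*y), which is 0 iff y == x1*x2
# and >= M otherwise, for binary variables and penalty weight M > |c|.

def quadratize_cubic(c, x1, x2, x3, y, M):
    """Return {(var_i, var_j): coeff} quadratic terms replacing c*x1*x2*x3.

    Variables are opaque labels; a linear term is stored as (v, v).
    """
    terms = {}

    def add(u, v, coeff):
        key = tuple(sorted((u, v)))
        terms[key] = terms.get(key, 0) + coeff

    add(y, x3, c)        # c * y * x3 stands in for c * x1*x2*x3
    add(x1, x2, M)       # penalty:  M * x1*x2
    add(x1, y, -2 * M)   #          -2M * x1*y
    add(x2, y, -2 * M)   #          -2M * x2*y
    add(y, y, 3 * M)     #          +3M * y   (linear term)
    return terms

# Example: quadratize 5*a*b*c with auxiliary variable 'y' and M = 10.
print(quadratize_cubic(5, "a", "b", "c", "y", 10))
```

Each quadratization of this kind trades one auxiliary variable per eliminated product for a strictly quadratic objective, which is exactly the kind of overhead the paper's scaling analysis is concerned with.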
Finding optimal join orders is among the most crucial steps to be performed by query optimisers. Though extensively studied in data management research, the problem remains far from solved: While query optimisers rely on exhaustive search methods to determine ideal solutions for small problems, such methods reach their limits once queries grow in size. Yet, large queries become increasingly common in real-world scenarios, and require suitable methods to generate efficient execution plans. While a variety of heuristics have been proposed for large-scale query optimisation, they suffer from degrading solution quality as queries grow in size, or feature highly sub-optimal worst-case behavior, as we will show.
We propose a novel method based on the paradigm of mixed integer linear programming (MILP): By deriving a novel MILP model capable of optimising arbitrary bushy tree structures, we address the limitations of existing MILP methods for join ordering, and can rely on highly optimised MILP solvers to derive efficient tree structures that elude competing methods. To ensure optimisation efficiency, we embed our MILP method into a hybrid framework, which applies MILP solvers precisely where they provide the greatest advantage over competitors, while relying on more efficient methods for less complex optimisation steps. Thereby, our approach gracefully scales to extremely large query sizes joining up to 100 relations, and consistently achieves the most robust plan quality among a large variety of competing join ordering methods.
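The bushy-tree MILP model itself is beyond the scope of this abstract. Purely to illustrate the workflow of handing an ordering problem to a MILP solver, here is a toy position-assignment formulation in PuLP; the relation names, cardinalities, and the linear position-weighted cost are made-up stand-ins and bear no relation to the paper's cardinality-based model:

```python
# Toy illustration only: a join-ordering-style assignment solved as a MILP.
# This is NOT the paper's bushy-tree model; the linear objective is a crude
# proxy that merely pushes large relations toward late join positions.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

relations = ["R", "S", "T", "U"]
est_rows = {"R": 1000, "S": 50, "T": 200, "U": 10}   # assumed cardinalities
positions = range(len(relations))

prob = LpProblem("toy_join_order", LpMinimize)
x = {(r, p): LpVariable(f"x_{r}_{p}", cat=LpBinary)
     for r in relations for p in positions}

# Each relation takes exactly one position, each position one relation.
for r in relations:
    prob += lpSum(x[r, p] for p in positions) == 1
for p in positions:
    prob += lpSum(x[r, p] for r in relations) == 1

# Earlier positions get a larger multiplier, so minimizing the objective
# places small relations early and large ones late.
n = len(relations)
prob += lpSum((n - p) * est_rows[r] * x[r, p]
              for r in relations for p in positions)

prob.solve(PULP_CBC_CMD(msg=False))
order = sorted(relations,
               key=lambda r: next(p for p in positions if x[r, p].value() == 1))
print("join order:", order)
```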
Enhanced sector coupling across the electricity, mobility, and heating sectors increases the effort required for distribution grid upgrades. Based on a case study, this paper evaluates the role of district heating networks in reducing electrical distribution grid reinforcements and compares their economic viability against a building-specific heat supply using heat pumps. A detailed energy system model is used to analyze two building energy renovation scenarios: a business-as-usual scenario with a 1 % annual renovation rate and an ambitious scenario with a rate of 2 %. Using a two-step optimization, the impact of different district heating network penetration levels on the distribution grid is evaluated, followed by an ex-post analysis to incorporate a simultaneity factor into the district heating networks. Overall, district heating networks can reduce distribution grid reinforcements, but the associated savings alone do not justify their construction, particularly in the ambitious renovation scenario. In the business-as-usual scenario, a district heating network can reduce reinforcement costs by up to 71 %. In the ambitious scenario, however, grid reinforcements are already reduced due to lower peak heat demand, and the maximum reinforcement cost savings amount to only 35 %. In economic terms, district heating networks are cost-competitive with building-specific heating only up to a heat supply share of 70 % in the business-as-usual scenario and up to 40 % in the ambitious scenario. In both scenarios, a district heating network can be a robust solution to lower the macroeconomic costs of a carbon-neutral heat supply.
BACKGROUND
The aim of this study was to compare and validate different dose calculation algorithms for use in radiation therapy of small lung lesions, and to optimize treatment planning using accurate dose calculation algorithms.
METHODS
A 9-field conformal treatment plan was generated on an inhomogeneous phantom with lung-equivalent inserts and a soft-tissue-equivalent insert mimicking a lung tumor. The dose distribution was calculated with the Pencil Beam and Collapsed Cone algorithms implemented in Masterplan (Nucletron) and with the Monte Carlo system XVMC, and validated using Gafchromic EBT films. Differences in dose distribution were evaluated. The plans were then optimized by adding segments to the outer shell of the target in order to increase the dose near the interface to the lung.
RESULTS
The Pencil Beam algorithm overestimated the dose by up to 15% compared to the measurements. Collapsed Cone and Monte Carlo predicted the dose more accurately, with maximum differences of −8% and −3%, respectively, compared to the film. Plan optimization by adding small segments to the peripheral parts of the target, creating a 2-step fluence modulation, increased target coverage and homogeneity compared to the uncorrected 9-field plan.
CONCLUSION
The use of forward 2-step fluence modulation in radiotherapy of small lung lesions improves tumor coverage and dose homogeneity compared to non-modulated treatment plans, and may thus help to increase the local tumor control probability. While the Collapsed Cone algorithm is closer to the measurements than the Pencil Beam algorithm, both algorithms are limited at tissue/lung interfaces, leaving Monte Carlo as the most accurate algorithm for dose prediction.
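As a trivial illustration of the comparison implied above, the sketch below computes the point-wise percent deviation of a calculated dose plane from a measured (film) dose plane. It assumes co-registered arrays in the same units, and the dose values are invented for illustration; a real plan comparison would typically use a gamma analysis rather than raw differences.

```python
# Sketch: percent deviation of a calculated dose plane from a measured one,
# normalized to the measured maximum (global normalization). Values invented.
import numpy as np

dose_calc = np.array([[2.10, 1.95], [1.80, 0.40]])   # Gy, e.g. Pencil Beam
dose_film = np.array([[2.00, 1.90], [1.85, 0.45]])   # Gy, film measurement

deviation_pct = 100.0 * (dose_calc - dose_film) / dose_film.max()
print("max deviation: %.1f%%" % np.abs(deviation_pct).max())
```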
We measured the cubic nonlinear susceptibility tensor elements (χ⁽³⁾) for polarization along the principal crystallographic axes in titanium-indiffused lithium niobate waveguides, assessed through self-phase modulation using picosecond-duration pulses at telecommunication wavelengths. A dominant, highly temperature- and wavelength-dependent contribution from a cascaded second-order nonlinearity is observed. Through careful extraction of the cascaded effect, we quantify the intrinsic third-order susceptibility tensor elements as χ⁽³⁾_zzzz(ω; ω, ω, −ω) = (5.2 ± 1.3) × 10⁻²¹ m² V⁻² and χ⁽³⁾_xxxx(ω; ω, ω, −ω) = χ⁽³⁾_yyyy(ω; ω, ω, −ω) = (3.6 ± 0.8) × 10⁻²¹ m² V⁻².
These measurements underscore the substantial impact of the cascaded nonlinearity in enhancing the effective cubic nonlinearity in lithium niobate and offer precise values essential for the design of nonlinear photonic devices.
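For context, the standard textbook relations (e.g., from nonlinear optics texts, not quoted from this paper) linking a self-phase-modulation measurement to χ⁽³⁾ are the nonlinear phase shift and the n₂–χ⁽³⁾ conversion in SI units, where φ_NL is the accumulated nonlinear phase, λ the wavelength, I the intensity, L_eff the effective interaction length, n₀ the linear refractive index, ε₀ the vacuum permittivity, and c the speed of light:

```latex
\varphi_{\mathrm{NL}} = \frac{2\pi}{\lambda}\, n_2\, I\, L_{\mathrm{eff}},
\qquad
n_2 = \frac{3\,\chi^{(3)}}{4\, n_0^{2}\, \varepsilon_0\, c}.
```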
Background
Cervical spine injuries in alpine sports require immediate immobilization at the site of the accident to avoid possible secondary damage caused by transportation. Using dedicated sensor technology, this study investigated whether a cervical spine orthosis (a cervical collar; Stifneck, Laerdal Medical GmbH, Puchheim, Germany) provides greater stability than a vacuum mattress alone.
Methods
With one male test subject, we simulated the transport of a patient with a spinal injury in steep alpine terrain. A wireless motion capture system (Xsens Technologies, Movella™ Inc., Henderson, USA) was used to record motion in three-dimensional space within a standardized environment. All tests were performed on a course set by the Bavarian Mountain Rescue Service. The test subject lay on a mountain rescue stretcher and was immobilized with a vacuum mattress, either with or without a cervical orthosis. The axes of cervical spine movement were analyzed separately.
Results
There were no significant differences between immobilization with and without a cervical orthosis with regard to lateral flexion in the frontal plane (max. 3.7° versus 3.0°), maximum excursion in flexion (max. 1.6° versus 2.8°), or extension (max. −1.6° versus −1.7°). Rotational movement around the craniocaudal axis was significantly greater without an orthosis (max. 2.4° versus 1.3°).
Conclusion
During mountain rescues, the cervical spine can be immobilized without a rigid cervical spine orthosis. Future research should explore the fundamental benefits of cervical spine immobilization, while the findings of this work contribute to the safe care of patients by avoiding the disadvantages associated with rigid cervical orthoses.
Germany is a post-migrant society. The plurality this entails is not limited to origins and biographies; it is also reflected in life plans and political attitudes, which may be not only democratic but also inhumane-völkisch and anti-democratic. The ReTra project aims to learn more about the structures, actors, and activities of Turkish far-right organizations in Bavaria, taking into account the influence of political changes in Turkey. The project contributes to research on right-wing extremism with attention to emerging transnational social spaces. The qualitative study is complemented by a participatory research approach in which civil-society actors are involved as co-researchers. An ongoing expert dialogue is conducted with relevant researchers, creating a network that enables continuous exchange between research and practice. Through a practical guide, the project contributes to civil-society practice and, beyond that, to democratic coexistence.