In order to generate a machine learning algorithm (MLA) that can support ophthalmologists with the diagnosis of glaucoma, a carefully selected dataset that is based on clinically confirmed glaucoma patients as well as borderline cases (e.g., patients with suspected glaucoma) is required. The clinical annotation of datasets is usually performed at the expense of the data volume, which results in poorer algorithm performance. This study aimed to evaluate the application of an MLA for the automated classification of physiological optic discs (PODs), glaucomatous optic discs (GODs), and glaucoma-suspected optic discs (GSODs). Annotation of the data to the three groups was based on the diagnosis made in clinical practice by a glaucoma specialist. Color fundus photographs and 14 types of metadata (including visual field testing, retinal nerve fiber layer thickness, and cup–disc ratio) of 1168 eyes from 584 patients (POD = 321, GOD = 336, GSOD = 310) were used for the study. Machine learning (ML) was performed in the first step with the color fundus photographs only and in the second step with the images and metadata. Sensitivity, specificity, and accuracy of the classification of GSOD vs. GOD and POD vs. GOD were evaluated. Classification of GOD vs. GSOD and GOD vs. POD performed in the first step had AUCs of 0.84 and 0.88, respectively. By combining the images and metadata, the AUCs increased to 0.92 and 0.99, respectively. By combining images and metadata, excellent performance of the MLA can be achieved despite having only a small amount of data, thus supporting ophthalmologists with glaucoma diagnosis.
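The image-plus-metadata fusion described above can be pictured as a two-branch network. Below is a minimal, hypothetical PyTorch sketch: a generic ResNet image branch fused with a small metadata MLP. The study's actual architecture is not specified in the abstract.

```python
# Illustrative sketch only: a generic two-branch network that fuses fundus-image
# features with tabular metadata for 3-class optic-disc classification
# (POD / GOD / GSOD). All architecture details are assumptions, not the study's model.
import torch
import torch.nn as nn
from torchvision import models

class FundusMetadataNet(nn.Module):
    def __init__(self, n_metadata: int = 14, n_classes: int = 3):
        super().__init__()
        backbone = models.resnet18(weights=None)    # image branch
        backbone.fc = nn.Identity()                 # expose 512-d features
        self.image_branch = backbone
        self.meta_branch = nn.Sequential(           # metadata branch
            nn.Linear(n_metadata, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU()
        )
        self.head = nn.Linear(512 + 32, n_classes)  # fused classifier

    def forward(self, image, metadata):
        fused = torch.cat([self.image_branch(image), self.meta_branch(metadata)], dim=1)
        return self.head(fused)

model = FundusMetadataNet()
logits = model(torch.randn(4, 3, 224, 224), torch.randn(4, 14))  # dummy batch
print(logits.shape)  # torch.Size([4, 3])
```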
The benefits of ideation for both industry and academia alike have been outlined by countless studies, leading to research into various approaches attempting to add new ideation methods or examine how the quality of the ideas and solutions created can be measured. Although AI-based approaches are being researched, none attempts to provide ideation participants with information that inspires new ideas and solutions in real time. Our proposal presents a novel and intuitive approach that supports users in real time by providing them with relevant information as they conduct ideation. By analyzing their ideas within the respective ideation sessions, our approach recommends items of interest with high contextual similarity to the proposed ideas, allowing users to skim through, for example, publications and quickly draw inspiration for new ideas. The recommendations also evolve in real time: as more ideas are written during the ideation session, the recommendations become more precise. This real-time approach is instantiated with various ideation methods as a proof of concept, and various models are evaluated and compared to identify the best model for working with ideas.
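A minimal sketch of the underlying idea, assuming a simple TF-IDF/cosine-similarity backend (the system's actual models are not specified here): as ideas accumulate in a session, the query vector is rebuilt, so the recommendations sharpen over time.

```python
# Minimal sketch (not the authors' system): rank candidate publications by
# contextual similarity to the ideas accumulated so far in an ideation session.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

publications = [
    "brainstorming support with large language models",
    "measuring idea quality in design workshops",
    "real-time text recommendation for collaborative writing",
]  # hypothetical corpus

vectorizer = TfidfVectorizer()
pub_vectors = vectorizer.fit_transform(publications)

def recommend(session_ideas: list[str], top_k: int = 2):
    """Return the publications most similar to all ideas written so far."""
    query = vectorizer.transform([" ".join(session_ideas)])
    scores = cosine_similarity(query, pub_vectors).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [(publications[i], round(float(scores[i]), 3)) for i in ranked]

print(recommend(["suggest related papers while we brainstorm"]))
```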
The kinetics and mechanism of drug binding to its target are critical to pharmacological efficacy. A high-throughput screening (HTS) campaign often results in hundreds of hits, of which usually only simple IC50 values are determined during reconfirmation. However, kinetic parameters such as residence time for reversible inhibitors and the kinact/KI ratio, which is the critical measure for evaluating covalent inactivators, are early predictive measures to assess the chances of success of the hits in the clinic. Using the promising cancer target human histone deacetylase 8 (HDAC8) as an example, we present a robust method that calculates concentration-dependent apparent rate constants for the inhibition or inactivation of HDAC8 from dose–response curves recorded after different pre-incubation times. With these data, hit compounds can be classified according to their mechanism of action, and the relevant kinetic parameters can be calculated in a highly parallel fashion. HDAC8 inhibitors with known modes of action were correctly assigned to their mechanism, and the binding mechanisms of some hits from an internal HDAC8 screening campaign were newly determined. The oxonitriles SVE04 and SVE27 were classified as fast reversible HDAC8 inhibitors with moderate time-constant IC50 values of 4.2 and 2.6 µM, respectively. The hit compounds TJ-19-24 and SAH03 behave like slow two-step inactivators or reversible inhibitors, with a very low reverse isomerization rate.
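For orientation, the standard two-step covalent inactivation model relates the apparent rate constant to the inhibitor concentration via k_obs = k_inact·[I]/(K_I + [I]); fitting k_obs values obtained at several concentrations yields k_inact and K_I. The sketch below shows this generic textbook fit with made-up numbers, not the paper's pipeline.

```python
# Generic textbook analysis (not the paper's exact method): for a two-step
# covalent inactivator, the apparent inactivation rate constant follows
#   k_obs = k_inact * [I] / (K_I + [I]),
# so fitting k_obs against inhibitor concentration yields k_inact, K_I and
# hence the efficiency ratio k_inact / K_I. All numbers below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def k_obs_model(conc, k_inact, K_I):
    return k_inact * conc / (K_I + conc)

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])             # µM (hypothetical)
k_obs = np.array([0.009, 0.016, 0.025, 0.038, 0.046, 0.052])  # 1/min (hypothetical)

(k_inact, K_I), _ = curve_fit(k_obs_model, conc, k_obs, p0=(0.06, 5.0))
print(f"k_inact = {k_inact:.3f} 1/min, K_I = {K_I:.1f} µM, "
      f"k_inact/K_I = {k_inact / K_I:.4f} 1/(µM·min)")
```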
To achieve climate neutrality in Germany by 2045, the federal government introduced a law in 2022 that obliges municipalities with more than 20,000 inhabitants to draw up a municipal heat demand plan. A heat demand plan covers the municipality's current as well as its future heat demand. In addition, potentials for generating renewable energy with heat pumps are identified (Landes Energie Agentur Hessen, 2024). From these potentials, district- or building-specific measures for possible implementation can be derived within the municipality. These measures consist of refurbishments, the construction and expansion of combined heat solutions, and specific individual solutions. The potential assessment also serves the future efficient coordination of planning, implementation, and funding. Via their municipal utilities, municipalities can build new heat networks or expand existing ones in a targeted manner (Landes Energie Agentur Hessen, 2020).
Biometric fingerprint identification hinges on the reliability of its sensors; however, calibrating and standardizing these sensors poses significant challenges, particularly with regard to repeatability and data diversity. To tackle these issues, we propose methodologies for fabricating synthetic 3D fingerprint targets, or phantoms, that closely emulate real human fingerprints. These phantoms enable the precise evaluation and validation of fingerprint sensors under controlled and repeatable conditions. Our research employs laser engraving, 3D printing, and CNC machining techniques, utilizing different materials. We assess the phantoms' fidelity to synthetic fingerprint patterns, intra-class variability, and interoperability across different manufacturing methods. The findings demonstrate that a combination of laser engraving or CNC machining with silicone casting produces finger-like phantoms with high accuracy and consistency for rolled fingerprint recordings. For slap recordings, direct laser engraving of flat silicone targets excels, and in the contactless fingerprint sensor setting, 3D printing and silicone filling provide the most favorable attributes. Our work enables a comprehensive, method-independent comparison of various fabrication methodologies, offering a unique perspective on the strengths and weaknesses of each approach. This facilitates a broader understanding of fingerprint recognition system validation and performance assessment.
We address the need for a large-scale database of children's faces by using generative adversarial networks (GANs) and face-age progression (FAP) models to synthesize a realistic dataset referred to as "HDA-SynChildFaces". To this end, we propose a processing pipeline that first utilizes StyleGAN3 to sample adult subjects, who are subsequently progressed to children of varying ages using InterFaceGAN. Intra-subject variations, such as facial expression and pose, are created by further manipulating the subjects in their latent space. Additionally, the pipeline allows the races of subjects to be evenly distributed, enabling the generation of a dataset that is balanced and fair with respect to race. The resulting HDA-SynChildFaces consists of 1,652 subjects and 188,328 images, each subject being present at various ages and with many different intra-subject variations. We then evaluated the performance of various facial recognition systems on the generated database and compared the results of adults and children at different ages. The study reveals that children consistently perform worse than adults on all tested systems and that the degradation in performance is proportional to age. Additionally, our study uncovers some biases in the recognition systems, with Asian and black subjects and females performing worse than white and Latino-Hispanic subjects and males.
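The latent-space manipulation can be illustrated with the InterFaceGAN-style editing rule: step along a learned hyperplane normal, optionally projected to be orthogonal to other attribute directions (conditional manipulation). The numpy sketch below uses random placeholder directions and omits the actual generator.

```python
# Illustrative numpy sketch of InterFaceGAN-style latent editing. The attribute
# directions and the generator are placeholders: in practice the "age" hyperplane
# normal is learned from labeled latents, and G is the pretrained StyleGAN3 generator.
import numpy as np

rng = np.random.default_rng(0)
latent_dim = 512

z = rng.standard_normal(latent_dim)       # sampled adult subject
n_age = rng.standard_normal(latent_dim)   # hypothetical "age" hyperplane normal
n_pose = rng.standard_normal(latent_dim)  # hypothetical "pose" normal
n_age /= np.linalg.norm(n_age)
n_pose /= np.linalg.norm(n_pose)

# Conditional manipulation: remove the pose component from the age direction so
# that walking along it changes age while disturbing pose as little as possible.
n_cond = n_age - (n_age @ n_pose) * n_pose
n_cond /= np.linalg.norm(n_cond)

# Progressing the subject towards younger ages = stepping along the direction.
for alpha in (-3.0, -2.0, -1.0):          # step sizes are arbitrary here
    z_child = z + alpha * n_cond
    # image = G(z_child)  # G would be the pretrained generator
    print(alpha, np.round(z_child[:3], 3))
```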
District heating plays a key role in the German heat transition (“Wärmewende”) to achieve climate protection targets. In order to realise the heating transition, the legislator has established cost efficiency as a central criterion in the relevant legislation. Ecology, as the third pillar of sustainability, is thus taking a back seat, despite the transformation’s influence on other sustainability dimensions beyond climate protection.
The article takes an ecological perspective on the district heating transformation and shows that, from this perspective, greater emphasis should be placed on local environmental heat and large heat pumps. In the second step, the decentralised information available on the actual transformation plans of district heating suppliers is aggregated and evaluated at a national level for the first time. The evaluation indicates a possible gap between the developed sustainable target state and the plans of district heating suppliers, which are primarily focussed on the cost efficiency criterion. This comparison identifies a potential conflict of objectives between the legislative cost efficiency criterion and the ecological sustainability perspective.
In this paper, we present a new processing method, called MOSES—Impacts, for the detection of micrometer-sized damage on glass plate surfaces. It extends existing methods by a separation of damaged areas, called impacts, to support state-of-the-art recycling systems in optimizing their parameters. These recycling systems are used to repair process-related damages on glass plate surfaces, caused by accelerated material fragments, which arise during a laser–matter interaction in a vacuum. Due to a high number of impacts, the presented MOSES—Impacts algorithm focuses on the separation of connected impacts in two-dimensional images. This separation is crucial for the extraction of relevant features such as centers of gravity and radii of impacts, which are used as recycling parameters. The results show that the MOSES—Impacts algorithm effectively separates impacts, achieves a mean agreement with human users of (82.0 ± 2.0)%, and improves the recycling of glass plate surfaces by identifying around 7% of glass plate surface area as being not in need of repair compared to existing methods.
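For illustration, separating touching, roughly circular damage sites is classically done with a watershed on the distance transform. The sketch below shows that generic recipe and the extraction of per-impact centers and radii; it is not the MOSES—Impacts algorithm itself.

```python
# Generic sketch of separating touching circular "impacts" in a binary image
# via watershed on the distance transform, then extracting the recycling
# parameters named above (center of gravity, radius). Image data is synthetic.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max
from skimage.measure import regionprops

# Hypothetical binary image with two overlapping impacts.
img = np.zeros((64, 64), dtype=bool)
yy, xx = np.ogrid[:64, :64]
img |= (yy - 30) ** 2 + (xx - 25) ** 2 < 12 ** 2
img |= (yy - 30) ** 2 + (xx - 42) ** 2 < 12 ** 2

distance = ndi.distance_transform_edt(img)
coords = peak_local_max(distance, min_distance=5, labels=img)
markers = np.zeros_like(img, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-distance, markers, mask=img)

# Per-impact features: center of gravity and (equivalent) radius.
for region in regionprops(labels):
    print(f"impact {region.label}: "
          f"centroid={tuple(round(c, 1) for c in region.centroid)}, "
          f"radius~{region.equivalent_diameter / 2:.1f} px")
```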
Human histone deacetylase 4 (HDAC4) is a key epigenetic regulator involved in a number of important cellular processes. This makes HDAC4 a promising target for the treatment of several cancers and neurodegenerative diseases, in particular Huntington's disease. HDAC4 is highly regulated by phosphorylation and oxidation, which determine its nuclear or cytosolic localization, and exerts its function through multiple interactions with other proteins, forming multiprotein complexes of varying composition. The catalytic domain of HDAC4 is known to interact with the SMRT/NCOR corepressor complex when the structural zinc-binding domain (sZBD) is intact and forms a closed conformation. Crystal structures of the HDAC4 catalytic domain have been reported showing an open conformation of HDAC4 when bound to certain ligands. Here, we investigated the relevance of this HDAC4 conformation under physiological conditions in solution. We show that proper zinc chelation in the sZBD is essential for enzyme function. Loss of the structural zinc ion not only leads to a massive decrease in enzyme activity, but also has serious consequences for the overall structural integrity and stability of the protein. However, the Zn2+-free HDAC4 structure in solution is incompatible with the open conformation. In solution, the open conformation of HDAC4 was also not observed in the presence of a variety of structurally divergent ligands. This suggests that the open conformation of HDAC4 cannot be induced in solution, and therefore cannot be exploited for the development of HDAC4-specific inhibitors.
The development of compact neutron sources for applications is extensive and features many approaches. For ion-based approaches, several projects with different parameters exist. This article focuses on ion-based neutron production below the spallation barrier for proton and deuteron beams with arbitrary energy distributions with kinetic energies from 3 MeV to 97 MeV. This contribution derives a predictive model using Monte Carlo simulations (on the order of 50,000 simulations) and deep neural networks; it is the first time a model of this kind has been developed. The model makes it possible to compare different ion-based neutron source concepts against each other quickly: lengthy Monte Carlo simulations, which individually take a long time to complete, can be circumvented, and a prediction of neutron spectra then takes a few milliseconds, which enables fast optimization and comparison. The model's shortcomings for low-energy neutrons (<0.1 MeV) and the cut-off prediction uncertainty (±3 MeV) are addressed, and mitigation strategies are proposed.
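The surrogate-model idea can be sketched as follows: a small network trained on Monte Carlo results maps beam parameters to a binned neutron spectrum, after which predictions are near-instant. Dimensions and data below are placeholders.

```python
# Hedged sketch of the surrogate idea: replace expensive Monte Carlo runs with
# a small network mapping beam parameters to a binned neutron spectrum. The
# data here is random; a real model would train on the ~50,000 simulations.
import torch
import torch.nn as nn

n_beam_params, n_spectrum_bins = 8, 64   # assumed dimensions

surrogate = nn.Sequential(
    nn.Linear(n_beam_params, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, n_spectrum_bins),
)

# Placeholder training data standing in for Monte Carlo results.
X = torch.randn(512, n_beam_params)
Y = torch.rand(512, n_spectrum_bins)

optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(surrogate(X), Y)
    loss.backward()
    optimizer.step()

# Once trained, a spectrum prediction takes milliseconds instead of a full run.
print(surrogate(torch.randn(1, n_beam_params)).shape)  # torch.Size([1, 64])
```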
A growing body of literature, mainly in the context of consumer research, indicates that the formal-aesthetic and conceptual design of objects can influence users' thoughts, emotions and even behavioural patterns. While there is strong evidence regarding these effects on actual purchasing decisions, evidence on the effect of aesthetic design features (e.g., haptics, colour) on health-related mental concepts and intentions for health behaviour change is scarce. Based on insights from material and conceptual priming, this article illustrates the research-driven and evidence-based design process of two design primes and comprises pre-tests and an experiment in two settings on the effect of design on health behaviour, focusing among other things on the intention for health behaviour change. In an evidence-based and research-driven process, two lecterns were designed to work as primes, i.e., to have a positive vs. negative influence on several mental constructs (sense of control, sense of coherence, resiliency, self-efficacy) and health-related intention. The lecterns differed mainly in terms of aesthetic appearance (e.g., material, colour, proportion, steadiness). They were tested in (a) a university setting with students (n = 83) and (b) a clinical setting with orthopaedic rehabilitation patients (n = 38). Participants were asked to perform an unrelated task (evaluation of an unrelated product) while standing at and using the lecterns. Overall, t-tests and Mann-Whitney U tests show no significant differences but differing tendencies in a mentioning task. When asked to name health-promoting activities, participants in the clinical setting using the "positive" prime (i.e., the steady lectern, n = 13) mentioned on average more sport-related aspects, and a higher proportion of their answers were sport-related, than participants using the "negative" prime (n = 11). In the university setting (positive: n = 36; negative: n = 38), no such differences emerged. This finding gives reason to believe that the prime might be specifically effective in the clinical setting, as it relates to physical activity being the most relevant topic of the patients' pathology.
The leather industry is a complex system with multiple actors that faces a fundamental transition toward more sustainable chemistry. To support this process, this article analyzes challenges of the industry and consumers' roles as a nexus of transition-relevant developments. We present findings of an empirical study (N = 439) among consumers on their perception of leather, related knowledge, and purchasing behavior. We found that participants perceived leather as natural, robust, and of high quality. Knowledge about the manufacturing of leather products was overall limited but varied. Applying a psychological behavior theory, we found that being aware of environmental and health consequences of conventional manufacturing of leather products was positively associated with a personal norm to purchase leather products that are less harmful to the environment and health. The perceived ease of buying such products was positively associated with their purchase. Our findings shed light on consumers' roles in the current leather system and their support of niche innovations toward more sustainable chemistry. Against this backdrop, we discuss implications for product design, consumer information, and needs for traceability along supply chains.
Technostress, i.e., stress resulting from the use of digital technologies, is a serious downside of the ongoing digitalization of the working world. The negative effects of this phenomenon are already apparent today. They include negative health consequences for the affected employees as well as substantial follow-up costs for companies through increased absenteeism and negative effects on employee productivity and satisfaction. The present study examines whether a leader's behaviour influences the emergence of technostress among their direct reports. In addition, the influence of further individual and organizational factors is examined. Using validated survey instruments, self-reported data from N = 849 employees of German companies were collected. The leadership behaviour of the direct supervisor was assessed on the basis of the leadership styles of the Full Range of Leadership Model by Avolio and Bass (1991), using the MLQ 5x short. The results of the data analysis using structural equation modelling indicate that the leadership behaviour of the direct supervisor influences employees' perceived technostress.
Solar phase scintillation and solar amplitude scintillation are fundamentally important in deep space mission operations for designing a communication system capable of transmitting signals when the signal path is close to the Sun. ESA's BepiColombo measurement data were analyzed in a previous paper in terms of the power spectral density of the solar phase scintillation, including a comparison with Woo's solar phase scintillation theory, when X-band and Ka-band signals propagate close to the Sun with a small Sun-Earth-Probe (SEP) angle during the superior solar conjunction campaign in March 2021 in its cruise phase to Mercury. In this paper, the solar amplitude scintillation is analyzed by calculating both the power spectral density and the scintillation index. The scintillation index values derived from these measurement data fit NASA JPL's scintillation index model.
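For reference, the amplitude scintillation index is commonly defined as the normalized intensity variance, m² = ⟨I²⟩/⟨I⟩² − 1, and the power spectral density can be estimated with Welch's method. A generic sketch with synthetic data, assuming these standard definitions:

```python
# Minimal, generic sketch (not the paper's processing chain): scintillation
# index m^2 = <I^2>/<I>^2 - 1 and Welch PSD of a received-intensity series.
# The signal below is a synthetic stand-in for link measurements.
import numpy as np
from scipy.signal import welch

fs = 10.0                                  # sampling rate in Hz (hypothetical)
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(1)
intensity = 1.0 + 0.1 * rng.standard_normal(t.size)  # received intensity

m2 = np.mean(intensity**2) / np.mean(intensity) ** 2 - 1.0  # scintillation index
freqs, psd = welch(intensity, fs=fs, nperseg=1024)

print(f"scintillation index m^2 = {m2:.4f}")
print(f"PSD estimated at {freqs.size} frequencies up to {freqs[-1]:.1f} Hz")
```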
With regard to AI as a key technology, this paper deals with the identification of the drivers of users' purchase decisions for a cooperative AI (an explainable AI, XAI), as well as with an analysis of willingness to pay in the context of value-based pricing. Besides the economic dimension with regard to usefulness and usability of the system, the focus is mainly on the (innovative) explainable character. The analysis is carried out by a choice-based conjoint analysis (CBC) using the example of an intelligent assistance system for employees that supports internal business processes and workflows in business organizations. For this purpose, fictitious purchase offers were created under which decision-makers in manufacturing business organizations in Germany made simulated purchase decisions. The analysis shows that the target group attaches great utility value to transparency in the sense of explanatory content, in addition to a high degree of interactivity and a high level of reliability.
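The statistical core of a CBC study can be sketched as a conditional logit: each choice task offers several attribute-coded alternatives, and part-worth utilities are estimated by maximum likelihood. The following self-contained sketch uses simulated choices, not the study's data.

```python
# Hedged sketch of generic CBC machinery: part-worth utilities estimated by a
# conditional (multinomial) logit on simulated choice tasks.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
n_tasks, n_alts, n_attrs = 200, 3, 4     # choice sets, alternatives, attributes
X = rng.standard_normal((n_tasks, n_alts, n_attrs))  # coded attribute levels
beta_true = np.array([1.0, -0.5, 0.8, 0.3])          # e.g. explainability, price, ...

# Simulate respondents' choices from the logit model.
util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
choice = np.array([rng.choice(n_alts, p=p) for p in prob])

def neg_log_likelihood(beta):
    v = X @ beta
    v -= v.max(axis=1, keepdims=True)                # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_tasks), choice].sum()

result = minimize(neg_log_likelihood, np.zeros(n_attrs), method="BFGS")
print("estimated part-worths:", np.round(result.x, 2))  # close to beta_true
```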
Random Forests are a powerful and frequently applied machine learning tool. The permutation variable importance (VIMP) has been proposed to improve the explainability of such a pure prediction model. It describes the expected increase in prediction error after randomly permuting a variable and disturbing its association with the outcome. However, VIMPs measure a variable's marginal influence only, which can make their interpretation difficult or even misleading. In the present work we address the general need for improving the explainability of prediction models by exploring VIMPs in the presence of correlated variables. In particular, we propose to use a variable's residual information to investigate whether its permutation importance partially or totally originates from correlated predictors. Hypothesis tests are derived by a resampling algorithm that can further support results by providing test decisions and p-values. In simulation studies we show that the proposed test controls type I error rates. When applying the methods to a Random Forest analysis of post-transplant survival after kidney transplantation, the importance of kidney donor quality for predicting post-transplant survival is shown to be high. However, the transplant allocation policy introduces correlations with other well-known predictors, which raises the concern that the importance of kidney donor quality may simply originate from these predictors. By using the proposed method, this concern is addressed and it is demonstrated that kidney donor quality plays an important role in post-transplant survival, regardless of correlations with other predictors.
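One way to operationalize the residual idea (the paper's exact algorithm may differ) is to keep the part of a variable explained by the other predictors and permute only its residual; if the prediction error barely rises, the marginal importance likely originated from correlated predictors.

```python
# Rough sketch of the core idea, with details as assumptions: destroy only the
# unique (residual) part of x2 and see whether the model's performance drops.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(7)
n = 500
x1 = rng.standard_normal(n)
x2 = 0.9 * x1 + 0.3 * rng.standard_normal(n)  # strongly correlated with x1
y = 2.0 * x1 + rng.standard_normal(n)         # only x1 carries unique signal
X = np.column_stack([x1, x2])

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
marginal = permutation_importance(rf, X, y, n_repeats=20, random_state=0)

# Residual information of x2 given x1; permute only this unique part.
resid = x2 - LinearRegression().fit(x1[:, None], x2).predict(x1[:, None])
X_res = X.copy()
X_res[:, 1] = (x2 - resid) + rng.permutation(resid)

residual_drop = rf.score(X, y) - rf.score(X_res, y)
print("marginal VIMP of x2:", round(marginal.importances_mean[1], 3))
print("score drop when only x2's residual is permuted:", round(residual_drop, 3))
```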
Evaluation of the Explanatory Power of Layer-wise Relevance Propagation using Adversarial Examples
(2023)
Approaches for visualizing and explaining the decision process of convolutional neural networks (CNNs) have recently received increasing attention. Particularly popular approaches are so-called saliency methods, which aim to assign a valence to each input pixel based on its importance and influence on the classification via saliency maps. In our paper, we contribute a novel analysis approach built on adversarial examples to investigate the explanatory power of saliency methods, exemplified by layer-wise relevance propagation (LRP). Based on the hypothesis that distinct decisions, such as an image's classification and the classification of its corresponding adversarial examples, should yield dissimilar saliency maps in order to provide transparent rationales, we break down relevance scores of images and corresponding adversarial examples and analyze them using a comprehensive statistical evaluation. It turns out that different relevance decomposition rules of LRP do not lead to clearly distinguishable saliency maps for images and corresponding adversarial examples, neither in terms of their contour lines, nor in terms of the statistical analysis.
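A minimal sketch of such an analysis, assuming a toy CNN, a one-step FGSM attack, and captum's LRP implementation (the paper's models, data, and statistical evaluation are far more extensive):

```python
# Hedged sketch: compute LRP attributions for an input and for its FGSM
# adversarial counterpart, then compare the two saliency maps.
import torch
import torch.nn as nn
from captum.attr import LRP
from scipy.stats import spearmanr

class ToyCNN(nn.Module):
    """Tiny CNN built only from layers captum's LRP supports."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, 3, padding=1)
        self.relu = nn.ReLU()
        self.fc = nn.Linear(8 * 28 * 28, 10)

    def forward(self, x):
        x = self.relu(self.conv(x))
        return self.fc(x.flatten(1))

model = ToyCNN().eval()

x = torch.rand(1, 1, 28, 28, requires_grad=True)
target = model(x).argmax(dim=1)

# FGSM: one signed-gradient ascent step on the loss of the predicted class.
loss = nn.functional.cross_entropy(model(x), target)
loss.backward()
x_adv = (x + 0.1 * x.grad.sign()).clamp(0, 1).detach()

lrp = LRP(model)
rel_clean = lrp.attribute(x, target=target).detach().flatten()
rel_adv = lrp.attribute(x_adv, target=model(x_adv).argmax(dim=1)).detach().flatten()

# If LRP provided transparent rationales, the two maps should differ clearly.
rho, _ = spearmanr(rel_clean.numpy(), rel_adv.numpy())
print(f"rank correlation of clean vs. adversarial saliency maps: {rho:.3f}")
```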
The concept and topic of sustainability have become established in social work. What remains open so far is what exactly is associated with sustainability and which references to the sustainability debate are invoked in which contexts of social work. This introductory contribution to the special focus Nachhaltigkeit – ein Thema (in) der Sozialen Arbeit aims to show and systematize what sustainability currently means for social work and what it can or should mean in the future. In addition to a theoretical discussion, it provides exemplary insights into the sustainability debates in fields of social work practice, identifies impulses for theory development and action concepts in social work, and introduces the contributions of the special focus.
Artificial Intelligence in studies—use of ChatGPT and AI-based tools among students in Germany
(2023)
AI-based tools such as ChatGPT and GPT-4 are currently changing the university landscape, and in many places the consequences for future forms of teaching and examination are already being discussed. To create an empirical basis for this, a nationwide survey of students was carried out to analyse the use of AI-based tools and the characteristics of such tools that are important to students. The aim of the quantitative study is to draw conclusions about how students use such AI tools. A total of more than 6300 students across Germany took part in the anonymous survey. The results of this quantitative analysis make it clear that almost two-thirds of the students surveyed use or have used AI-based tools as part of their studies. In this context, almost half of the students explicitly mention ChatGPT or GPT-4 as a tool they use. Students of engineering sciences, mathematics and natural sciences use AI-based tools most frequently. A differentiated examination of usage behaviour makes it clear that students use AI-based tools in a variety of ways. Clarifying questions of understanding and explaining subject-specific concepts are the most relevant reasons for use in this context.
A novel material testing concept is developed in order to provide tensile and compressive properties within a single mechanical test. A new specimen geometry is designed for testing in a universal testing machine. Under tensile load, both a homogeneous tensile stress state and a homogeneous compressive stress state occur in the specimen. Measurements accompanying the experimental test with digital image correlation provide the tensile and compressive Poisson's ratio as well as the tensile modulus. These properties are input parameters for subsequent finite element simulations. The compressive modulus is determined by iteratively adjusting finite element simulations in order to couple experimental and simulated results. For validating the concept, experimental tests are carried out on polyoxymethylene. While the tensile Poisson's ratio of the new concept shows the best agreement with the reference value, the compressive modulus is approximately 15% higher. Further work should focus on an appropriate material model in order to reduce the deviation.
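The experiment-simulation coupling amounts to a one-dimensional root search: adjust the compressive modulus until the simulated response matches the measurement. In the sketch below, run_fe_simulation is a hypothetical stand-in for the real FE solver call.

```python
# Hedged sketch of the coupling idea: iterate the compressive modulus fed into
# an FE simulation until the simulated strain matches the measured one.
# run_fe_simulation is a hypothetical placeholder (here a dummy closed form);
# in practice each evaluation would be one external FE run.
from scipy.optimize import brentq

measured_strain = 0.0105            # hypothetical DIC measurement

def run_fe_simulation(e_compressive: float) -> float:
    """Placeholder for the FE model: returns simulated compressive strain."""
    applied_stress = 32.0           # MPa, hypothetical load level
    return applied_stress / e_compressive

def mismatch(e_compressive: float) -> float:
    return run_fe_simulation(e_compressive) - measured_strain

# Bracket the root and solve for the modulus that reproduces the measurement.
e_fit = brentq(mismatch, 1000.0, 10000.0)
print(f"identified compressive modulus ~ {e_fit:.0f} MPa")
```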
A growing number of economic geography scholars have discussed the spatial dimensions of sustainability transitions (STs), which entail radical changes in socio-technical systems to overcome societal, economic, and ecological problems. This involves innovation processes with a broad range of distinctive actors. Innovation intermediaries, such as universities and research institutes, are needed to support and accelerate the transfer of knowledge. Nevertheless, little is known about the influence of such actors on the configuration of the knowledge bases required for STs. This article presents insights from 14 semi-structured interviews with experts conducted in a regional innovation system (RIS) in East Germany. In cooperation with the Eberswalde University for Sustainable Development, we investigate four innovation intermediaries in the region of Eberswalde. The analytical framework links the concept of differentiated knowledge bases to small wins. Our results show that, first, in the Eberswalde region, the relevant actors involved in regional knowledge transfer focus predominantly on synthetic knowledge bases, such as experience-based knowledge of local area settings. Second, symbolic knowledge bases are crucial and often prerequisites for intermediary organizations to recombine knowledge bases and support the capability to innovate in regional knowledge transfer. Symbolic knowledge entails the ability to translate scientific findings into a language that can be understood by the various actors in knowledge transfer. Third, changes in organizational structures complement changes in cultural-cognitive and normative institutions to support innovation on a systemic level and foster change processes.
Review of „Cyber-Sicherheit“
(2023)
The manufacturing industry is undergoing a transformation marked by the emergence of the Industry 4.0 and Industry 5.0 paradigms, which are characterized by the integration and automation of machinery. In the process, the machinery evolves into Cyber-Physical Systems (CPSs). These CPSs consist of software and hardware modules implementing complex manufacturing processes. The ongoing integration of machinery and external technologies, e.g., the Industrial Internet of Things (IIoT), has led to an evolving Smart Manufacturing (SM) environment. At the same time, legacy (brownfield) machinery exists side by side with modern CPSs and may be integrated into the modern manufacturing process by retrofitting. The evolution of the SM domain driven by the Industry 4.0 and Industry 5.0 paradigms therefore leads to a more complex SM environment. Moreover, the integration and ongoing adaptation of technologies and processes introduce novel relationships and dependencies between the employed machinery and systems. Fault Diagnosis (FD) in such a complex SM environment becomes more time-consuming and laborious. A side effect of the ongoing evolution is the advancing capability of the machinery to produce data. Consequently, not only complex data but also vast quantities of it have to be analyzed during any FD. The search for the origin of a fault is challenging, and technical constraints in the SM environment further hinder a thorough FD. For instance, the available bandwidth for data transmission does not match the machinery's capability to produce vast data quantities. The practical challenge is therefore to focus on specific areas of the SM environment while choosing a reasonable granularity of data surveillance that covers the fault traces without losing too much information. Any FD thereby depends heavily on the domain knowledge of the professionals entrusted with the FD task. On top of that, there is economic pressure, which increases the strain on these professionals, since unexpected downtime and lost production quantity translate directly into economic loss.
The thesis introduces context-aware FD to mitigate the increased complexity of the SM environment and to support professionals in their work. By supporting the professionals, the time for FD can be reduced, which results in faster fault amendment and reduced cost-intensive production downtimes. The Context-Aware Diagnosis in Smart Manufacturing (TAOISM) Visual Analytics (VA) model backs the context-aware FD. The TAOISM VA model is the theoretical foundation for the context-aware FD and defines the data layer, the models layer, the visualization layer, and the knowledge layer for SM. The VA model enables the definition of context, context models, and context hierarchies for their integration in the respective layers. The main idea behind the context-aware FD is to use the narrowing character of the context definition to slice vast amounts of data into manageable context-separated data groups. The context model thereby works as a virtual boundary across machinery and systems, which encloses the physical domain (hardware) and the immaterial domain (software) equally. Further, the thesis focuses on contextual faults, which arise from context model violations, and proposes approaches for collecting contextual data. The automated building of context models and the extraction and transformation of contextual data are also part of the thesis. Employing the context models impacts each layer of the proposed TAOISM VA model. For each layer, various approaches show the impact of the context models and their employment in three different application scenarios for FD in SM. The performed research is tested and verified in the scenarios of Robotics Application Development (RAD), Maintenance of Industrial Inspection Machines (MIIM) and Abnormal Event Management in Production Lines (AEMPL). Along with employing context models, data augmentation with context models is proposed. Among other benefits, the presented data augmentation technique can balance undersampled datasets, which would enable a reduction of data recordings for any context-aware FD in the future. The data augmentation technique thereby addresses the existing inaccuracies in an SM environment, which also impact the quality of any employed Artificial Intelligence (AI). Another approach targets the unsupervised selection of production-relevant variables in order to focus FD-related data recordings and surveillance automatically on areas of the SM environment that are active during production, without any domain knowledge involved. The hypothesis, which was confirmed, was that faults, especially contextual faults, occur more often in active software and hardware modules. A further challenge arising from the vast amount of data is that labeling data for AI becomes uneconomical, even for small fault cases. As a result, evaluating any AI model in SM becomes challenging, as standard measures, e.g., accuracy, precision, recall, and F1-score, cannot be applied. For this case, the thesis proposes novel AI performance metrics that decouple comparability and correctness to enable the evaluation of AI models in an SM environment. All these contributions have led to the development of two distinct Proofs of Concept (PoCs). The PoCs are the reference implementations of the context-aware FD and reflect a knowledge-based FD Expert System (ES) and an unsupervised data-driven FD system. The latter was part of the thorough evaluation of the context-aware FD by two groups of domain experts and junior professionals.
The successful qualitative evaluation not only points towards a working context-aware FD but also unveils future research directions and a future vision for SM. Additional domain expert interviews expose views on the relevance of a context-aware FD in SM for the future. Overall, the evaluation indicates that context-aware FD has versatile applicability, usability, and suitability in SM-related FD.
The overall objective of this dissertation is to enable a more efficient and effective point cloud and mesh partition for artists and 3D application developers. In this dissertation, 3D scans are assumed as the source data material of the 3D application development, reducing the manual and time-consuming modelling of virtual objects. Furthermore, the scanned data is assumed to be processed to a point cloud and reconstructed to a polygon mesh. The mesh has to be partitioned into the objects of interest to design specific interactions with a game engine. Interviews revealed that the partition is manually conducted on a mesh with 3D manipulation software, which is time-consuming. The partition creation should be automated to increase efficiency and effectiveness. Freely available point cloud and mesh partition algorithms require an expert with appropriate programming skills and field knowledge, which makes them difficult to use. More precisely, the algorithms cannot be used in existing workflows as they are not implemented in a common graphical 3D manipulation software. Besides these problems, the partition automation should work on real-world data and have a low runtime to raise efficiency. Different sub-research objectives were formulated from these problems and requirements, leading to novel approaches in the domains of: (a) sequential partition creation with deep reinforcement and imitation learning, (b) episodic partition creation with graph neural networks, (c) match-based reward calculation and (d) synthetic scene generation. One sub-research objective is the replacement of a human expert with an agent. In this context, a novel deep reinforcement learning (DRL) partition framework is presented. Experiments were conducted using this framework combined with the region growing algorithm and synthetic scenes created by a self-developed scene generator. The maximum reward could almost be achieved with a fine-tuned PointNet and by evaluating the wall and non-wall objects separately. This approach, however, is not applicable to real-world scenes, which would be necessary to achieve the efficiency and effectiveness objective. Therefore, another DRL partition approach is introduced, where an agent unifies superpoints in the so-called superpoint growing environment. The point cloud is divided into superpoints, which are unified into the objects of interest by an agent. The experimental results show that this approach can be applied to real-world scenes. In addition to the application of DRL, an imitation learning approach was developed, increasing the agent's performance in the superpoint growing environment. The runtime in the sequential superpoint growing environment is poor, as each union decision requires a neural network call. Hence, a further sub-research objective is to improve the runtime. An episodic environment was developed as a solution, requiring only one graph neural network call. Similarities between superpoints are estimated in this environment and passed to a union algorithm. The differences between two graph neural network architectures and two union algorithms were experimentally investigated. According to the results, calculating the superpoint similarities with a correlation of the embedded node features is more robust than the similarity estimation with a sigmoid activation function. The reward function used in the DRL partition approaches was realised by a matching procedure.
As this function influences the partition quality, another sub-research objective is to investigate the differences between various match types. Matching functions from the literature were compared, and another match type was introduced. The usage of different match types in the learning process was experimentally evaluated. Although an agent gets more feedback with all match types, the best results (visually and in terms of partition size) were achieved by only using first-order matches in the reward function. The synthetic scenes of the region growing approach lack realism as the lighting information is ignored, which can be important for training networks for the partition task. Therefore, a further sub-research objective is to develop a scene generator where the lighting is taken into account. After its development, the generated scenes were experimentally evaluated in a pre-training task. It turned out that the lighting information is important for pre-training, as higher accuracies were achieved. Furthermore, faster convergence can be achieved with the pre-trained network instead of training a network on a target data set from scratch.
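A generic way to compute first-order matches is an IoU-based Hungarian assignment between predicted and ground-truth segments; the thesis' exact reward definition may differ from this sketch.

```python
# Hedged sketch of a match-based reward: pair predicted segments with
# ground-truth objects by IoU via the Hungarian algorithm and keep only
# first-order (one-to-one) matches as the reward signal.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_reward(pred: np.ndarray, gt: np.ndarray, iou_threshold: float = 0.5):
    """pred/gt: integer label per point; returns mean IoU of matched pairs."""
    pred_ids, gt_ids = np.unique(pred), np.unique(gt)
    iou = np.zeros((len(pred_ids), len(gt_ids)))
    for i, p in enumerate(pred_ids):
        for j, g in enumerate(gt_ids):
            inter = np.sum((pred == p) & (gt == g))
            union = np.sum((pred == p) | (gt == g))
            iou[i, j] = inter / union
    rows, cols = linear_sum_assignment(-iou)          # maximize total IoU
    matched = iou[rows, cols]
    matched = matched[matched >= iou_threshold]       # first-order matches only
    return matched.mean() if matched.size else 0.0

pred = np.array([0, 0, 1, 1, 2, 2, 2])   # toy partition of 7 points
gt = np.array([0, 0, 0, 1, 1, 1, 1])
print(round(match_reward(pred, gt), 3))  # 0.708
```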
Another sub-research objective targets the development of a usable partition interface. In this context, the Blender add-on OpenXtract was developed, containing five open-source point cloud partition algorithms. The partition algorithms were extended by approximating geodesic distances so that the edges of meshes are used. An experiment has shown that the extended algorithms achieve higher accuracies, which is considered an increase in effectiveness. Moreover, unstructured interviews revealed that OpenXtract can improve the effectiveness and efficiency of the partition creation.
"Jede Stadt hat ihre Mollerstadt" war ein geflügelter Satz im Forschungsprojekt "s:ne", dem Transferprojekt, aus dem diese Publikation hervorgegangen ist.
Tatsächlich haben viele deutsche Städte ähnliche Quartiere. Im Zweiten Weltkrieg zerbombt, wieder aufgebaut im Stil und mit den niedrigen Standards der Nachkriegszeit. Mit einer kleinteiligen Parzellenstruktur und heterogener Eigentümerschaft in einer innerstädtischen Eins-B-Lage - wo Wohnen anders als in der benachbarten "City" noch eine zentrale Rolle spielt.
Diese Quartiere sind in die Jahre gekommen. Sie haben aber eine besondere Bedeutung auch und gerade im Hinblick auf eine nachhaltige Stadtentwicklung im Sinne der Stadt der kurzen Wege.
Positioned within sustainability science as a multidisciplinary field, this thesis combines an engineering part with a social science part. In the engineering part, evidence was found that recycled plastics can, with regard to their mechanical properties, be used in highly loaded structural components and can thus substitute virgin plastics. To this end, virgin and recycled polypropylene with a talc filling of 30 percent by weight are investigated systematically. The influence of notches, weld lines, mean stress, temperature, and ageing on the mechanical properties under static and cyclic loading is examined. Accompanying analytical investigations describe the molecular and crystalline differences between virgin and recycled plastics, allowing conclusions about the mechanical properties to be drawn and scientifically substantiated.
The mechanical characteristics determined under static and cyclic loading feed into a notch stress concept based on the highly stressed material volume V80 and on the stress gradient χ*. Local loading characteristics are determined according to the concept of relative inelastic strains developed in this thesis.
On this basis, a cyclic strength verification is carried out for the appliance carrier of a dishwasher, demonstrating that a carrier made of the investigated recycled material can withstand the required service life. Accompanying numerical calculations and component tests under service conditions validate the fatigue life estimate.
The social science part of this thesis examines how recycled plastics can move from their technological niche into broad application. For this purpose, the multi-level perspective model according to Geels [Gee02] is used. Lifting recycled plastics out of their technological niche requires a strategy involving various actors from different levels. The strategy is intended to identify factors that enable recycled plastics to form a regime of their own. This strategy is elicited in guideline-based expert interviews with actor groups from the socio-economic, socio-technical, and socio-political spheres. Using a content-structuring qualitative content analysis, a strategy is derived for deploying recycled plastics more extensively in technical applications and for how the market must develop in the future.
Background
As the climate and environmental crises unfold, eco-anxiety, defined as anxiety about the crises’ devastating consequences for life on earth, affects mental health worldwide. Despite its importance, research on eco-anxiety is currently limited by a lack of validated assessment instruments available in different languages. Recently, Hogg and colleagues proposed a multidimensional approach to assess eco-anxiety. Here, we aim to translate the original English Hogg Eco-Anxiety Scale (HEAS) into German and to assess its reliability and validity in a German sample.
Methods
Following the TRAPD (translation, review, adjudication, pre-test, documentation) approach, we translated the original English scale into German. In total, 486 participants completed the German HEAS. We used Bayesian confirmatory factor analysis (CFA) to assess whether the four-factorial model of the original English version could be replicated in the German sample. Furthermore, associations with a variety of emotional reactions towards the climate crisis, general depression, anxiety, and stress were investigated.
Results
The German HEAS was internally consistent (Cronbach’s alphas 0.71–0.86) and the Bayesian CFA showed that model fit was best for the four-factorial model, comparable to the factorial structure of the original English scale (affective symptoms, rumination, behavioral symptoms, anxiety about personal impact). Weak to moderate associations were found with negative emotional reactions towards the climate crisis and with general depression, anxiety, and stress.
Discussion
Our results support the original four-factorial model of the scale and indicate that the German HEAS is a reliable and valid scale to assess eco-anxiety in German speaking populations.
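For illustration, the reported internal consistency (Cronbach's alpha) follows directly from the item covariance structure; the sketch below computes it for one hypothetical four-item subscale with random stand-in responses.

```python
# Hedged illustration of one reported reliability statistic: Cronbach's alpha,
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)),
# computed per subscale. The item data below is random stand-in data, not HEAS responses.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of one subscale."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(486, 1))                # shared subscale factor
items = latent + 0.8 * rng.normal(size=(486, 4))  # 4 hypothetical items
print(f"alpha = {cronbach_alpha(items):.2f}")
```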
Valid online inference is an important problem in contemporary multiple testing research, to which various solutions have been proposed recently. It is well known that these existing methods can suffer from a significant loss of power if the null p-values are conservative. In this work, we extend the previously introduced methodology to obtain more powerful procedures for the case of super-uniformly distributed p-values. These types of p-values arise in important settings, e.g., when discrete hypothesis tests are performed or when the p-values are weighted. To this end, we introduce the method of super-uniformity reward (SUR), which incorporates information about the individual null cumulative distribution functions. Our approach yields several new 'rewarded' procedures that offer uniform power improvements over known procedures and come with mathematical guarantees for controlling online error criteria based either on the family-wise error rate (FWER) or the marginal false discovery rate (mFDR). We illustrate the benefit of super-uniform rewarding in real-data analyses and simulation studies. While discrete tests serve as our leading example, we also show how our method can be applied to weighted p-values.
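To see where super-uniformity comes from, consider an exact binomial test: its p-value only attains finitely many values, so its null CDF satisfies F(t) ≤ t, with equality only at attainable points. The SUR method exploits exactly these F_i. A generic demonstration (not the paper's implementation):

```python
# Super-uniformity of discrete p-values: a one-sided exact binomial test attains
# only finitely many p-values, so P(p <= t) = F(t) <= t for all t under H0.
import numpy as np
from scipy.stats import binom

n, p0 = 10, 0.5                        # exact binomial test, H0: p = p0
pmf = binom.pmf(np.arange(n + 1), n, p0)
p_values = binom.sf(np.arange(n + 1) - 1, n, p0)  # P(X >= k) under H0

# Null CDF of the p-value, evaluated at each attainable p-value.
for t in sorted(set(np.round(p_values, 6))):
    F_t = pmf[p_values <= t + 1e-12].sum()
    print(f"t = {t:.4f}   F(t) = {F_t:.4f}   (F(t) <= t: {F_t <= t + 1e-9})")
```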
Discrete uniform and homogeneous p-values often arise in applications with multiple testing. For example, this occurs in genome-wide association studies whenever a non-parametric one-sample (or two-sample) test is applied throughout the gene loci. In this paper, we consider multiple comparison procedures for such scenarios based on several existing estimators for the proportion of true null hypotheses, π0, which take the discreteness of the p-values into account. The theoretical guarantees of the several approaches with respect to the estimation of π0 and false discovery rate control are reviewed. The performance of the discrete procedures is investigated through intensive Monte Carlo simulations considering both independent and dependent p-values. The methods are also applied to three real data sets for illustration purposes. Since the particular estimator of π0 used to compute the q-values may influence its performance, relative advantages and disadvantages of the reviewed procedures are discussed, and practical recommendations are given.
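As a hedged illustration of why discreteness matters for π0 estimation, the sketch below contrasts the classical Storey estimator with a discreteness-aware variant that replaces the uniform tail mass m·(1 − λ) by the actual null tail masses 1 − F_i(λ). This is a sketch of the general idea only; the estimators reviewed in the paper may differ in detail.

```python
import numpy as np

def storey_pi0(pvals, lam=0.5):
    """Classical Storey estimator; implicitly assumes uniform null p-values."""
    m = len(pvals)
    return min(1.0, (np.sum(np.asarray(pvals) > lam) + 1) / (m * (1 - lam)))

def discrete_pi0(pvals, null_cdfs, lam=0.5):
    """Discreteness-aware variant (sketch of the general idea only).

    For conservative discrete tests, F_i(lam) < lam, hence
    1 - F_i(lam) > 1 - lam: the denominator grows and the upward bias
    of the classical estimator is reduced.
    """
    tail_mass = sum(1.0 - F(lam) for F in null_cdfs)
    return min(1.0, (np.sum(np.asarray(pvals) > lam) + 1) / tail_mass)
```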
Several classical methods exist for controlling the false discovery exceedance (FDX) for large-scale multiple testing problems, among them the Lehmann-Romano procedure (Lehmann and Romano 2005) ([LR] below) and the Guo-Romano procedure (Guo and Romano 2007) ([GR] below). While these two procedures are the most prominent, they were originally designed for homogeneous test statistics, that is, when the null distribution functions of the p-values Fi, 1 ≤ i ≤ m, are all equal. In many applications, however, the data are heterogeneous, which leads to heterogeneous null distribution functions. Ignoring this heterogeneity induces a lack of power. In this paper, we develop three new procedures that incorporate the Fi's while maintaining rigorous FDX control. The heterogeneous version of [LR], denoted [HLR], is based on the arithmetic average of the Fi's, while the heterogeneous version of [GR], denoted [HGR], is based on the geometric average of the Fi's. We also introduce a procedure [PB], based on the Poisson-binomial distribution, that uniformly improves [HLR] and [HGR], at the price of a higher computational complexity. Perhaps surprisingly, this shows that, contrary to the known theory of false discovery rate (FDR) control under heterogeneity, the way to incorporate the Fi's can be particularly simple in the case of FDX control, and does not require any further correction term. The performances of the new proposed procedures are illustrated by real and simulated data in two important heterogeneous settings: first, when the test statistics are continuous but the p-values are weighted by some known independent weight vector, e.g., coming from co-data sets; second, when the test statistics are discretely distributed, as is the case for data representing frequencies or counts. Our new procedures are implemented in the R package FDX, see Junge and Döhler (2020).
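For orientation, the homogeneous baseline [LR] step-down procedure can be sketched in a few lines; its critical values are α_i = (⌊γi⌋ + 1)α / (m + ⌊γi⌋ + 1 − i). The heterogeneous procedures [HLR], [HGR] and [PB] described above additionally incorporate the null CDFs Fi and are available in the R package FDX; the following is only a minimal illustration of the baseline, not a reimplementation of the new methods.

```python
import numpy as np

def lehmann_romano(pvals, gamma=0.1, alpha=0.05):
    """Classical [LR] step-down procedure for homogeneous p-values,
    controlling the false discovery exceedance P(FDP > gamma) <= alpha."""
    m = len(pvals)
    i = np.arange(1, m + 1)
    k = np.floor(gamma * i) + 1                 # floor(gamma * i) + 1
    crit = k * alpha / (m + k - i)              # LR critical values
    order = np.argsort(pvals)
    below = np.asarray(pvals)[order] <= crit
    n_rej = 0
    for idx, ok in enumerate(below):            # step-down: stop at the
        if not ok:                              # first exceedance
            break
        n_rej = idx + 1
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:n_rej]] = True
    return rejected

print(lehmann_romano([0.0001, 0.002, 0.3, 0.8, 0.01]))
```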
This experimental study investigates readers' perceived text quality of, and trust towards, journalistic opinion pieces written by the language model GPT-3. GPT-3 is capable of automatically writing texts in human language and is often referred to as an artificial intelligence (AI). In a 2x2x2 within-subjects experimental design, 192 participants were each presented with two randomly selected articles for evaluation. The articles were varied with regard to the variables actual source, declared source (in each case human-written or AI-written) and topic (1 & 2). Prior to the experiment, participants indicated the extent to which they agreed with various statements about the trustworthiness of AI, in order to capture their personal attitudes towards the topic.
The study found, first, that readers considered articles written by GPT-3 to be just as good as those written by human journalists. The AI-generated versions were even rated slightly better in terms of both text quality and the trust placed in the content, although the effect was not statistically significant. Second, no negative effect on article perception was found for texts disclosed as AI-written. Articles declared as written by an AI were mostly rated equally well or, again, minimally better than texts declared as human-written, especially regarding trust. Only readability was rated slightly worse when the AI was declared as the source. Furthermore, a correlation was found between the participants' personal attitudes towards AI and their perception of allegedly AI-written articles: for articles declared as AI-written, there are slight to moderate positive correlations between personal attitudes towards AI and each quality rating criterion. Personal preconceptions thus play a role in the perception of AI-written articles.
AI-based tools such as ChatGPT and GPT-4 are currently transforming the higher education landscape, and in many places the consequences for future forms of teaching and examination are already being discussed. To create an empirical basis for this debate, a Germany-wide survey of students was conducted, recording usage behaviour regarding AI-based tools in the context of their studies and everyday life. Among other things, various functions of AI-based tools were identified that students rated as particularly important. The aim of the quantitative survey was thus to capture how AI tools are used and which factors are decisive for their use.
In total, more than 6,300 students across Germany took part in the anonymous survey. The results of this quantitative analysis show that almost two thirds of the surveyed students use, or have used, AI-based tools in the context of their studies. In this context, almost half of the surveyed students explicitly name ChatGPT or GPT-4 as a tool they use. Students of engineering as well as mathematics and the natural sciences use AI-based tools most frequently.
A differentiated look at usage behaviour shows that students employ AI-based tools in many ways. Clarifying questions of understanding and explaining subject-specific concepts are among the most relevant reasons for use in this context.
Combating climate change requires a fundamental transformation of the global energy system in order to reduce emissions of climate-damaging greenhouse gases. The use of renewable energies makes it possible to do without fossil fuels. However, compared to fossil technologies, this requires an increased input of mineral raw materials. The global energy transition must therefore also be understood as a shift towards a more material-intensive energy system. Growing efforts to expand renewable energies and the technologies for their use may thus cause demand for mineral raw materials to rise sharply in the future. Against this background, this thesis assesses the potential increases in demand and examines the possible future consequences, focusing on aspects of the supply situation and energy requirements. In addition, it investigates whether the energy transition is retroactively influenced or hindered by its own effects, and whether limiting factors can be identified that impair the transformation and, in particular, its speed. The topic is first examined separately for the mineral raw materials copper and lithium as examples, before the results are then placed in a joint context. Overall, the thesis provides a broad and up-to-date collection of knowledge on the role of raw materials in the energy transition and, in particular, uses the instrument of the meta-study to derive well-founded forecasts of future developments up to the year 2050. The most important findings of this Master's thesis can be summarised in the following key statements:
• The speed of the global energy transition can be negatively affected by scarcity and high prices of copper and lithium. This can be seen as a feedback effect of the global market, which is strained by a rapid transformation.
• Demand for copper and lithium is expected to rise sharply in the course of the energy transition, so a supply shortfall may occur, as supply capacities may reach the limits of feasible growth. For both raw materials considered, electromobility is one of the largest demand drivers.
• The most decisive limiting factor for supply growth of both raw materials is the speed at which primary extraction capacities can be expanded, since primary production will remain the most important supply route in the future.
• Due to the lack of foreseeable substitution options, the availability of lithium is a limiting factor for the expansion of electromobility and could thus also have a dampening effect on the energy transformation. Lithium-free battery and storage technologies should therefore increasingly be considered as a complement.
• For copper, depletion of ore deposits emerged as the most relevant feedback effect of intensive extraction. Industrial copper mining, practised for a long time, leads, through declining ore grades, to a disproportionately increasing effort in primary extraction from mines. This rising effort dampens supply growth and increases the energy demand of copper provision, which in turn progressively worsens the energy balance of copper-based technologies.
• The innovation potential for increasing supply and reducing the energy demand for copper is largely exhausted. For lithium, by contrast, which has been used on a larger scale only for a comparatively short time, considerable innovation potential remains in all areas.
The state government of Hesse has committed itself to covering the final energy consumption for heating entirely from renewable energy sources in the future. This requires a switch from fossil energy carriers to heating concepts that use renewable energy sources. Heat pumps make it possible to tap heat sources in such a way that fossil energy carriers are no longer used at all.
This thesis examines the feasibility of using heat pumps at the Schöfferstraße campus of Hochschule Darmstadt.
Furthermore, this work is intended to show which measures are required for the use of heat pumps and which risks and difficulties arise from them; on this basis, investment costs can be estimated. It is also meant to convey which factors influence the use of heat pumps.
With regard to the question of whether the Schöfferstraße campus should be connected to a district heating network of the City of Darmstadt or supply itself autonomously with heat pumps, a feasibility study on the use of heat pumps is essential.
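As a back-of-the-envelope illustration of the kind of sizing question such a feasibility study answers, the sketch below converts an annual heat demand into the electricity a heat pump would draw, given a seasonal coefficient of performance (SCOP). Both numbers are placeholders for demonstration, not campus data.

```python
def heat_pump_electricity(annual_heat_kwh, seasonal_cop):
    """Electrical energy needed to deliver a given annual heat demand.
    The SCOP depends strongly on source and sink temperatures, which is
    precisely what a feasibility study has to establish."""
    return annual_heat_kwh / seasonal_cop

# Hypothetical building: 500 MWh/a heat demand, assumed SCOP of 3.5
print(f"{heat_pump_electricity(500_000, 3.5):,.0f} kWh electricity per year")
```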
Abstract
Background
The EU chemicals regulation "Registration, Evaluation, Authorisation and Restriction of Chemicals" (REACH) aims to reduce the usage of substances of very high concern (SVHCs) by firms. To this end, a consumer right-to-know about SVHCs in articles is intended to create market-based incentives. However, awareness of the right-to-know among EU citizens is low. Moreover, the response window of 45 days afforded to suppliers impedes immediate, informed decisions by consumers. Consequently, despite being in effect for more than 10 years, only few consumers send requests. Civil society actors have developed smartphone applications that reduce information search costs by allowing users to send right-to-know requests upon scanning an article's barcode. Answers are stored in a database and made available to the public immediately. This paper assesses to which extent such smartphone tools contribute to an increased use of the right-to-know by undertaking a case study of the application "ToxFox" by the German non-profit organisation Bund für Umwelt und Naturschutz Deutschland (BUND).
Results
An analysis of the data from the BUND database for the period 2016 to 2018 reveals that about 20 thousand users have sent almost 49 thousand requests. This has led to more than 9 thousand database entries, including 189 articles which contain SVHCs above the legal threshold. The data also indicate that receiving information on requested articles encourages further use of the application. Many suppliers accept the application and pro-actively provide information on articles without SVHCs above the threshold. However, most consumers use the application only for a short time, and suppliers are struggling to reply to right-to-know requests.
Conclusion
Evaluating the results, the study identifies options to enhance the application’s design in terms of user motivation and legal certainty, and to enhance the framework governing "barcode" assignments to articles with a view to better contributing to transparency. As for policy implications, a lack of consumer requests can in part be traced back to design flaws of the right-to-know and a lack of implementation and enforcement of REACH. In addition, suppliers have to increase their supply chain communication efforts to make sure they are in a position to properly answer consumer requests. We recommend several policy options addressing these and additional aspects, thus contributing to the legislative review of Art. 33 REACH.
Review: "Data Governance"
(2020)
Due to the increasing demand for higher bandwidth in modern communication systems, conventional networks are continuously expanded with new technologies to improve coverage. Free space optical communications (FSOC) shows significant advantages concerning system setup time in comparison with classical fiber-optic systems on the one hand, and substantial spectral bandwidth and performance in comparison with wireless systems under certain conditions on the other. This makes the technology not only a reasonable extension for metropolitan area networks but also provides the capability to quickly set up a network after an outage caused by a natural disaster. However, transmitting data using FSOC involves some limiting factors that have to be considered prior to each installation. Since the atmospheric channel is not static, changing weather conditions or industrial smog have a significant impact on the available bit rate. A simulation platform for investigating FSOC under these circumstances is developed and presented in this paper. Considering the atmospheric channel, turbulence, distance-dependent beam divergence, and the applied modulation schemes, a general overview of the capabilities is presented and discussed. The insights of this paper should help in deciding under which preconditions FSOC provides a meaningful application possibility, and when the limiting factors become too severe and other technologies must be considered.
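As a rough illustration of the kind of calculation such a simulation platform builds on, the sketch below evaluates a bare-bones FSO link budget with a weather-dependent attenuation term. All parameter values are assumptions for demonstration; the platform described above models turbulence, beam divergence and modulation in far more detail.

```python
import numpy as np

def fso_link_budget(tx_power_dbm, wavelength_m, distance_m,
                    tx_gain_db, rx_gain_db, atm_loss_db_per_km):
    """Received power (dBm) from a simplified free-space optical link
    budget: transmit power plus antenna (telescope) gains, minus the
    geometric free-space loss and a weather-dependent attenuation."""
    fsl_db = 20 * np.log10(4 * np.pi * distance_m / wavelength_m)
    atm_db = atm_loss_db_per_km * distance_m / 1e3   # fog/haze dependent
    return tx_power_dbm + tx_gain_db + rx_gain_db - fsl_db - atm_db

# Hypothetical 1550 nm metro link over 2 km in light haze (~3 dB/km)
print(fso_link_budget(20, 1550e-9, 2e3, 100, 100, 3.0))  # dBm received
```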
The relevance of Machine Intelligence, a.k.a. Artificial Intelligence (AI), is undisputed at the present time. This is not only due to AI successes in research but, more prominently, its use in day-to-day practice. In 2014, we started a series of annual workshops at the Leibniz Zentrum für Informatik, Schloss Dagstuhl, Germany, initially focussing on Corporate Semantic Web and later widening the scope to Applied Machine Intelligence. This article presents a number of AI applications from various application domains, including medicine, industrial manufacturing and the insurance sector. Best practices, current trends, possibilities and limitations of new AI approaches for developing AI applications are also presented. Focus is put on the areas of natural language processing, ontologies and machine learning. The article concludes with a summary and outlook.
To study combustion fundamentals of complex fuels under well-defined boundary conditions, a novel Temperature Controlled Jet Burner (TCJB) system is designed that can stabilise both gaseous and pre-vaporised liquid fuels. In a first exploratory experimental study, piloted turbulent jet flames of pre-vaporised methanol, ethanol, 2-propanol and 2-butanol mixtures are compared to methane/air as a reference fuel. Complementary one-dimensional laminar flame calculations are used to provide flame parameters for comparison. Blow-off and flame length as global flame characteristics are measured over a wide range of equivalence ratios. For fuel-rich conditions, blow-off limits correlate well with extinction strain rate calculations. Differing flame lengths from lean to rich conditions are partly explained by different flame wrinkling, which is assessed using planar laser-induced fluorescence imaging of the hydroxyl radical (OH-PLIF). A study of Lewis-number effects indicates that they have a substantial influence on flame wrinkling. Lean alcohol/air flames, as opposed to methane/air, have a Lewis number greater than unity. This impedes curvature development, which promotes relatively large flame lengths. In contrast, across stoichiometric conditions, the Lewis numbers of all alcohol/air mixtures decrease significantly. At such conditions, alcohol/air flames show similar or even larger wrinkling compared to methane/air flames. Quantitatively, however, the differences in flame length and wrinkling observed among the flames can be explained neither by Lewis-number differences alone, nor by other global mixture parameters available from 1D laminar flame calculations. This study therefore emphasises the need for more detailed experimental analyses of the full thermochemical state of laminar and turbulent flames fuelled with complex fuels.
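For readers unfamiliar with the quantity, the Lewis number invoked above is simply the ratio of thermal to mass diffusivity, Le = α/D = λ/(ρ·cp·D). The sketch below computes it for rough, air-like property values; the numbers are illustrative orders of magnitude only, not data from the study.

```python
def lewis_number(thermal_conductivity, density, cp, mass_diffusivity):
    """Le = alpha / D; Le > 1 means heat diffuses faster than the
    deficient species, which damps flame wrinkling as discussed above."""
    alpha = thermal_conductivity / (density * cp)   # thermal diffusivity
    return alpha / mass_diffusivity

# Roughly air-like mixture properties (assumed values, SI units)
print(lewis_number(thermal_conductivity=0.026, density=1.1,
                   cp=1005.0, mass_diffusivity=2.0e-5))  # ~1.2, i.e. Le > 1
```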
Signals and images with discontinuities appear in many problems in such diverse areas as biology, medicine, mechanics and electrical engineering. The concrete data are often discrete, indirect and noisy measurements of some quantities describing the signal under consideration. A frequent task is to find the segments of the signal or image which corresponds to finding the discontinuities or jumps in the data. Methods based on minimizing the piecewise constant Mumford–Shah functional—whose discretized version is known as Potts energy—are advantageous in this scenario, in particular, in connection with segmentation. However, due to their non-convexity, minimization of such energies is challenging. In this paper, we propose a new iterative minimization strategy for the multivariate Potts energy dealing with indirect, noisy measurements. We provide a convergence analysis and underpin our findings with numerical experiments.
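To make the Potts model concrete, the sketch below is the classical O(n²) dynamic program that exactly minimizes the one-dimensional piecewise-constant Potts energy for direct noisy data. It only illustrates the base model; the paper's contribution, an iterative minimization strategy for the multivariate energy with indirect measurements, is substantially more involved.

```python
import numpy as np

def potts_1d(y, gamma):
    """Exact minimizer of the 1D piecewise-constant Potts energy
        sum of within-segment squared errors + gamma * (number of jumps)
    for direct noisy data y, via the classical dynamic program."""
    n = len(y)
    s1 = np.concatenate(([0.0], np.cumsum(y)))            # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(np.square(y))))

    def seg_err(l, r):  # squared deviation from the mean on y[l:r]
        k = r - l
        return s2[r] - s2[l] - (s1[r] - s1[l]) ** 2 / k

    best = np.full(n + 1, np.inf)
    best[0] = -gamma                      # cancels the first segment's +gamma
    last = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(r):                # try every last-segment start l
            cand = best[l] + gamma + seg_err(l, r)
            if cand < best[r]:
                best[r], last[r] = cand, l
    bounds, r = [], n                     # backtrack segment boundaries
    while r > 0:
        bounds.append(r)
        r = last[r]
    return sorted(bounds)

y = np.array([0.1, -0.2, 0.0, 5.1, 4.9, 5.0, 1.0, 1.1])
print(potts_1d(y, gamma=1.0))  # segment end indices: [3, 6, 8]
```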
The link performance of free space optical communications (FSOC) and deep space optical communications (DSOC) is investigated by considering two scenarios in space communications: the downlink and uplink between Earth ground stations and near-Earth geostationary (GEO) satellites, and the link between Earth and a spacecraft at a large distance of 1 astronomical unit (AU). Generally, a distance larger than 0.01 AU, or approximately 1,500,000 km from Earth, is considered deep space. In these theoretical investigations, different realistic system parameters for the optical lasers, transmitters, receivers, avalanche photodiodes (APDs) and optical telescopes, as well as atmospheric disturbances such as scintillation and absorption, are considered. The simulation results are compared with existing project data and valuable ESA experimental results to verify and improve the simulation models. The comparison in this paper shows that the simulation models for the link budget and the scintillation estimation are suitable for investigating FSOC and DSOC, and can be used to study improved designs and implementations of DSOC projects for planned long-term and medium-term space missions.
Over the course of a typical deep space mission, such as a Mars-Earth mission, a wide range of operating points exists, because changes in geometry cause different link budgets in terms of received signal and noise power. These changes include the distance range, the Sun-Earth-Probe angle, the zenith angle and atmospheric conditions. The different operating points, with their different losses (background noise, pointing losses and atmospheric losses), lead to different capacities and data rates over the course of the mission. Consequently, different engineering parameters are adjusted and optimized to combat some of these varying losses in order to achieve acceptable data rates and bit error probabilities. This motivates analyzing and simulating the various operating conditions that occur over the varying orbital time periods, in terms of the resulting received signal power level, noise power level, capacity, data rates and bit error probabilities. This paper details the results of simulations of typical deep space optical communication link operation.
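As a hedged illustration of why these operating points matter, the sketch below estimates a photon-budget-limited data rate from a simple link budget: received power falls with the square of distance, and dividing it by the energy per photon and an assumed number of photons per bit gives a crude upper bound on the rate. All parameter values are illustrative assumptions, not figures from the simulations in the paper.

```python
import numpy as np

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_limited_rate(p_tx_w, gain_tx_db, gain_rx_db, losses_db,
                        wavelength_m, distance_m, photons_per_bit):
    """Crude photon-budget bound on the data rate of an optical link."""
    fsl_db = 20 * np.log10(4 * np.pi * distance_m / wavelength_m)
    p_rx_dbw = (10 * np.log10(p_tx_w) + gain_tx_db + gain_rx_db
                - losses_db - fsl_db)
    p_rx_w = 10 ** (p_rx_dbw / 10)
    e_photon = H * C / wavelength_m          # energy of one photon
    return p_rx_w / (e_photon * photons_per_bit)   # bits per second

AU = 1.496e11  # metres
# Hypothetical 1064 nm downlink from 1 AU with high-gain optical telescopes
print(photon_limited_rate(4.0, 115, 145, 10, 1064e-9, AU,
                          photons_per_bit=2))  # tens of Mbit/s in this toy case
```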