The article shows, in principle and by means of concrete examples, that requirements for construction products rarely exist in a blanket, uniform way for buildings; in the large majority of cases they arise during the planning process for each specific situation of use in a particular building. Frequently, the same building requirement can be met by different solution approaches, each of which implies different requirements for the construction products used.
Under the title "Gamification trifft Hybride Lehre" (gamification meets hybrid teaching), a teaching project was carried out in the summer semester of 2023 in the statistics course of the Wirtschaftsmathematik-Aktuarwissenschaften degree program at Technische Hochschule Rosenheim. Various hybrid-synchronous learning settings were tested with the students in a teaching experimentation room equipped with the appropriate technology. Gamification, or gameful motivation, was used as the approach for preparing and following up the hybrid teaching sessions.
In the teaching project team with Stefanie Neumaier, the hybrid teaching was prepared and reflected upon using the so-called EMPAMOS method. The aim was not to turn the non-game context (hybrid teaching) into a game, but to use game elements to reach the objectives. Engaging with hybrid formats is regarded as a future topic for students and teachers alike, not least for acquiring future skills in a working world in which hybrid collaboration is becoming established.
The article deals with the treatment of so-called "non-load-bearing fire walls" (nichttragende Brandwände) in German building regulation law. With the transition from the former Bauregelliste (BRL) to the (Muster-)Verwaltungsvorschrift Technische Baubestimmungen (MVV TB), the basis for issuing general building authority test certificates and for the required assignment of building authority requirements to performance specifications in applicability verifications was withdrawn for these walls. The option of a "waiver declaration" (Verzichtserklärung) by the highest building supervisory authorities is presented.
How is paternity determined? The ruling of the Federal Constitutional Court on so-called secret paternity tests established a right to clarification of genetic descent. For fathers, the aim of this clarification is to determine whether their legal child is also their biological child. The ruling was followed by a law on the clarification of descent. The new law creates the "always certain" father. Against this background, how secure is the position of the child? What is the relationship between fathers' rights and children's rights? Sabina Schutter examines these questions using a discourse-analytical approach.
Your one-stop solution for getting up and running with SQL Server 2000
Develop and deploy large-scale applications with SQL Server 2000. In this book, database expert Dusan Petkovic explains how to use all the features of this powerful, scalable relational database management system. You'll learn to configure SQL Server 2000, use T-SQL, execute efficient database queries, and enable secure transactions. Troubleshooting, data warehousing, and optimization are also covered. You'll find complete details on Microsoft Analysis Services, managing multiple servers, maximizing uptime, and performance tuning. With SQL Server 2000: A Beginner's Guide, your mission-critical database applications will be up and running in no time!
Learn to:
Set up, manage, and customize SQL Server 2000
Administer multiple instances of SQL Server 2000 using the SQL Server Enterprise Manager
Perform simple and complex database queries
Tune the performance of your SELECT statements
Use SQL extensions, stored procedures, transactions, views, and triggers
Implement the SQL Server 2000 security model
Find and fix database problems by capturing and replaying server activity
Import, export, and transform data types using DTS
Construct data warehouses to collect, organize, and distribute information efficiently
The main focus of this work is the development of new methods for the self-calibration of a rigid stereo camera system. However, many of the algorithms introduced here have a wider impact, particularly in robot hand-eye calibration with all its different areas of application. Stereo self-calibration refers to the computation of the intrinsic and extrinsic parameters of a stereo rig using neither a priori knowledge on the movement of the rig nor on the geometry of the observed scene.
The stereo parameters obtained by self-calibration, namely rotation and translation from left to right camera, are used for computing depth maps for both images, which are applied for rendering correctly occluded virtual objects into a real scene (Augmented Reality).
The proposed methods were evaluated on real and synthetic data and compared to algorithms from the literature. In addition to a stereo rig, an optical tracking system with a camera mounted on an endoscope was calibrated without a calibration pattern using the proposed extended hand-eye calibration algorithm.
The self-calibration methods developed in this work have a number of features, which make them easily applicable in practice: They rely on temporal feature tracking only, as this monocular tracking in a continuous image sequence is much easier than left-to-right tracking when the camera parameters are still unknown.
Intrinsic and extrinsic camera parameters are computed during the self-calibration process, i.e., no calibration pattern is required. The proposed stereo self-calibration approach can also be used for extended hand-eye calibration, where the eye poses are obtained by structure-from-motion rather than from a calibration pattern.
An inherent problem to hand-eye calibration is that it requires at least two general movements of the cameras in order to compute the rigid transformation.
If the motion is not general enough, only a part of the parameters can be obtained, which would not be sufficient for computing depth maps. Therefore, a main part of this work discusses methods for data selection that increase the robustness of hand-eye calibration. Different new approaches are shown, the most successful ones being based on vector quantization.
The data selection algorithms developed in this work can not only be used for stereo self-calibration, but also for classic robot hand-eye calibration, and they are independent of the actually used hand-eye calibration algorithm.
Deploy and manage SQL Server 2005 with ease
Learn to use all the powerful features available in SQL Server 2005 from this straightforward, hands-on guide.
Set up SQL Server 2005, automate system administration tasks, execute simple and complex database queries, and use the robust analysis, business intelligence, and reporting tools. Troubleshooting, data partitioning, replication, and query optimization are also covered.
With SQL Server 2005: A Beginner's Guide, you'll be able to set up a secure, reliable, and productive data management platform in no time.
Essential Skills for Database Professionals
- Install and customize SQL Server 2005
- Create, alter, and remove database objects with Transact-SQL statements
- Use SQL Server as a native XML database system
- Tune your database system for optimal performance
- Use the new SQL Server Management Studio tool for executing and analyzing ad hoc queries
- Retrieve data from more than one source using join operations and SELECT statements
- Secure your database using two different authentication modes--Windows and mixed
- Restore databases using transaction logs and backup and recovery methods
- Streamline system administration tasks using the SQL Server Agent service tool
- Analyze and manage information stored in a data warehouse with Microsoft Analysis Services
Background
To date, targeted tyrosine kinase inhibitors have been approved for FGFR2 and FGFR3 fusions (pemigatinib and erdafitinib, respectively), but the importance of FGFR2 mutations for transformation activity and as a druggable gene variant with response to different FGFR inhibitors is poorly understood. FGFR2 inhibitors represent a mainstay of treatment for locally advanced or metastatic intrahepatic cholangiocellular carcinoma (iCCA).
Methods
A 74-year-old male was diagnosed with iCCA in liver segments seven and eight, with infiltration of the hepatic veins and the inferior vena cava; molecular testing revealed a C382R mutation in the transmembrane domain of the FGFR2 receptor. We performed an in-silico study to understand the potential mode of action of the mutant FGFR2 target. In addition to experimentally determined structures, we used a structure generated by AlphaFold2, as the variant in question is located at a position that is not well resolved experimentally. This revealed that the C382R mutation lies in the transmembrane domain at a position crucial for signal transduction, both for activation and inhibition of downstream signaling. The Molecular Tumor Board decided to start treatment with 13.5 mg pemigatinib once daily for 14 days, followed by a 7-day therapy-free interval, resulting in a sustained partial response. The patient continues to be treated with 13.5 mg as described above.
Results
In our case report, we were able to show that the patient in whom a C382R mutation was detected responded to the therapy with pemigatinib. This shows that real-world scenarios differ from the data of the approval studies, thereby illustrating how complex the data on patients with FGFR mutations are. One of the main problems of large approval studies is that the functionality of the respective alterations is often disregarded.
Conclusions
Our results suggest that the respective mutation may be successfully targeted by FGFR-selective tyrosine kinase inhibitors, demonstrating the importance of the functional characterization of mutations.
No conflicts of interest.
Get Started on Microsoft SQL Server 2008 in No Time
Learn to use all of the powerful features available in SQL Server 2008 quickly and easily.
Microsoft SQL Server 2008: A Beginner's Guide explains the fundamentals of each topic alongside examples and tutorials that walk you through real-world database tasks.
Install SQL Server 2008, construct high-performance databases, use powerful Transact-SQL statements, create stored procedures and triggers, and execute simple and complex database queries. Performance tuning, Database Engine security, Business Intelligence, and XML are also covered.
Set up, configure, and maintain SQL Server 2008
Build and manage database objects using Transact-SQL statements
Create stored procedures and user-defined functions
Optimize database performance, availability, and reliability
Implement solid security using authentication, encryption, and authorization
Automate tasks using SQL Server Agent
Create reliable data backups and perform flawless system restores
Use all-new SQL Server 2008 Business Intelligence, development, and administration tools
Learn in detail the SQL Server XML technology (SQLXML)
Get Started on Microsoft SQL Server 2012 in No Time
Learn to use all of the powerful features available in SQL Server 2012 quickly and easily.
Microsoft SQL Server 2012: A Beginner's Guide explains the fundamentals of each topic alongside examples and tutorials that walk you through real-world database tasks.
Install SQL Server 2012, construct high-performance databases, use powerful Transact-SQL statements, create stored procedures and triggers, and execute simple and complex database queries.
Performance tuning, Database Engine security, Business Intelligence, and XML are also covered.
Set up, configure, and maintain SQL Server 2012
Build and manage database objects using Transact-SQL statements
Create stored procedures and user-defined functions
Optimize database performance, availability, and reliability
Implement solid security using authentication, encryption, and authorization
Automate tasks using SQL Server Agent
Create reliable data backups and perform flawless system restores
Use all-new SQL Server 2012 Business Intelligence, development, and administration tools
Learn in detail the SQL Server XML technology (SQLXML)
Get up and running on Microsoft SQL Server 2016 in no time with help from this thoroughly revised, practical resource. The book offers thorough coverage of SQL management and development and features full details on the newest business intelligence, reporting, and security features.
Filled with new real-world examples and hands-on exercises, Microsoft SQL Server 2016: A Beginner's Guide, Sixth Edition, starts by explaining fundamental relational database system concepts.
From there, you will learn how to write Transact-SQL statements, execute simple and complex database queries, handle system administration and security, and use the powerful analysis and BI tools. XML, spatial data, and full-text search are also covered in this step-by-step tutorial.
· Revised from the ground up to cover the latest version of SQL Server
· Ideal both as a self-study guide and a classroom textbook
· Written by a prominent professor and best-selling author
Recent development of GaN power transistors with blocking voltages up to 650V enables novel power electronics applications with outstanding performance in high-frequency operation. This paper demonstrates a class E power amplifier with 13.56MHz switching frequency for inductively coupled DC power supplies. Continuous wave output power up to 200W is achieved with 95% Power Added Efficiency (PAE).
Objective. The authors performed a methodological comparison of the usual standard gamble with methods that could also be used in mailed questionnaires. Methods. Ninety-two diabetic patients valued diabetes-related health states twice. In face-to-face interviews, the authors used an iterative standard gamble (ISG), in which the probabilities were varied in a ping-pong manner, and a self-completion method (SC) with top-down titration as the search procedure (SC-TD) in 2 independent subsamples of 46 patients. Three months later, all patients received a mailed questionnaire in which the authors used the self-completion method with bottom-up (SC-BU) and top-down (SC-TD) search procedures. Results. ISG and SC-TD showed feasibility and consistency in the interviews. The ISG resulted in significantly higher utilities than the SC-TD. Two thirds of the mailed questionnaires provided useful results, indicating some problems of feasibility. Utilities measured by SC-BU and SC-TD did not differ significantly, showing procedural invariance. Further, patients indicated ambivalence when given the choice not to definitely state their preferences. Conclusions. The results show that different strategies to collect standard gamble utilities can yield different results. Compared with the usually applied ISG, the SC method is feasible in interviews and provides a consistent alternative that is less costly when used in mailed questionnaires, although its practicability has to be improved in this latter setting.
The number of embedded systems in our daily lives that are distributed, hidden, and ubiquitous continues to increase. Many of them are safety-critical. To provide additional or better functionalities, they are becoming more and more complex, which makes it difficult to guarantee safety. It is undisputed that safety must be considered before the start of development, continue until decommissioning, and is particularly important during the design of the system and software architecture. An architecture must be able to avoid, detect, or mitigate all dangerous failures to a sufficient degree. For this purpose, the architectural design must be guided and verified by safety analyses. However, state-of-the-art component-oriented or model-based architectural design approaches use different levels of abstraction to handle complexity. So, safety analyses must also be applied at different levels of abstraction, and it must be checked and guaranteed that they are consistent with each other, which is not supported by standard safety analyses. In this paper, we present a consistency check for Component Fault Trees (CFTs) that automatically detects commonalities and inconsistencies between fault trees of different levels of abstraction. This facilitates the application of safety analyses in top-down architectural designs and reduces effort.
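The abstract does not spell out the check itself; as a purely hypothetical illustration (all data structures and names below are invented), one can think of each fault tree as a set of minimal cut sets and of the abstraction relation as a mapping from abstract basic events to the refined events that realize them. An abstract cut set with no counterpart in the refined tree would then indicate an inconsistency:

```python
# Hypothetical sketch: comparing fault trees from two abstraction levels.
# Each tree is reduced to its minimal cut sets (sets of basic-event names);
# 'refinement' maps an abstract basic event to the refined events realizing it.
from itertools import product

def refined_candidates(abstract_cut_set, refinement):
    """Enumerate the refined cut sets implied by one abstract cut set."""
    options = [refinement.get(ev, {ev}) for ev in abstract_cut_set]
    return {frozenset(combo) for combo in product(*options)}

def check_consistency(abstract_cut_sets, refined_cut_sets, refinement):
    """Report abstract cut sets that have no counterpart in the refined model."""
    missing = []
    for acs in abstract_cut_sets:
        candidates = refined_candidates(acs, refinement)
        if not any(rcs <= cand for rcs in refined_cut_sets for cand in candidates):
            missing.append(acs)
    return missing

abstract = [frozenset({"sensor_fails"}), frozenset({"power_loss", "backup_fails"})]
refined = [frozenset({"sensor_hw_defect"}), frozenset({"psu_fault", "battery_empty"})]
mapping = {"sensor_fails": {"sensor_hw_defect", "sensor_sw_bug"},
           "power_loss": {"psu_fault"}, "backup_fails": {"battery_empty"}}
print(check_consistency(abstract, refined, mapping))   # [] -> consistent
```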
In safety analysis for safety-critical embedded systems, methods such as FMEA and fault trees (FT) are strongly established in practice. However, the current shift towards model-based development has resulted in various new safety analysis methods, such as Component Integrated Fault Trees (CFT). Industry demands to know the benefits of these new methods. To compare CFT to FT, we conducted a controlled experiment in which 18 participants from industry and academia had to apply each method to safety modeling tasks from the avionics domain.
Although the analysis of the solutions showed that the use of CFT did not yield a significantly different number of correct or incorrect solutions, the participants subjectively rated the modeling capacities of CFT significantly higher in terms of model consistency, clarity, and maintainability. The results are promising for the potential of CFT as a model-based approach.
Purpose: This study aimed to gain knowledge about environmental factors (EFs) that impact work and social life participation of people with multiple sclerosis (MS) in Austria and Switzerland to extend the knowledge of participation and to identify key areas for measuring participation.
Method: A three-round Delphi study was conducted, defining patients as experts. In the 1st round, qualitative data were gathered through questionnaires and analyzed with content analysis, and factors were assigned to EFs as classified in the ICF. In the 2nd and 3rd rounds, experts judged the EFs according to their relevance in order to reach consensus (cut-off 75%). Categories were ranked on a scale from most important to important.
Results: One hundred and twelve Austrian and 109 Swiss experts were recruited. The content analysis revealed 768 EFs. The study resulted in a list of 176 consensus factors for Austria and 177 for Switzerland. Five categories proved to be highly important, 12 moderately important, 6 fairly important, and 10 important.
Conclusions: This study indicates that participation in work or social life is influenced by physical, social, attitudinal, and policy factors. Consensus factors afford insights into areas for consideration in the development of participation outcome measurements and support a comprehensive and inclusive rehabilitation approach.
We present inductive power transfer (IPT) with a half-bridge converter based on differential relaxation self-oscillation. The oscillation dynamics of the converter automatically adapt to variations in the inductive coupling link and to changes in the load. Design equations based on theoretical circuit analysis reveal a high power transfer efficiency (>90%) for variations of the coupling distance in the strong coupling regime. A prototype system employing GaN HEMTs confirms the theoretically predicted characteristics of the proposed circuit.
We present a novel differential relaxation oscillator for inductive power transfer. The proposed oscillator offers an automatic adaptation of the oscillation period to changes in the power link impedance, which ensures highly efficient operation for variable loads and coupling factors. A detailed circuit model of the self-oscillating system is provided together with analytical design equations. We find a large self-oscillation range for variable coupling factors and loads. The theoretical results are confirmed by circuit simulation and measurements carried out on a low-power demonstrator system. We demonstrate non-resonant and robust self-adaptation of the relaxation oscillator to coupling factors ranging from k = 0.6 to 0.9.
A quantitative physical model for potential-induced degradation of the shunting type (PID-s) in solar modules is introduced. Based on a drift and diffusion approach for sodium ions and atoms, it gives insight into the kinetics of degradation and the corresponding regeneration. A simple drift/source term is used to describe the time-dependent flux of Na ions towards stacking faults at the surface of the solar cell. The assumed transport mechanism for Na+ ions through the SiN layer uses the modified Stern-Eyring rate theory, but our approach can also be adapted to other mechanisms. Several PID-s and regeneration curves of one-cell solar modules at T = 49°C and T = 90°C with a 1000 V potential difference between module frame and cell were measured, and least-square fits of the in-situ measured parallel resistance Rsh to the model were performed, showing very good agreement.
Based on a few measurements, the model can predict PID-s and regeneration characteristics of solar modules under different conditions.
Within the scope of the recently completed cooperative research project "VibWood" at the Technical University of Munich and Rosenheim University of Applied Sciences, fully parametrized numerical models for lightweight wooden floor constructions have been developed and calibrated by experimental modal analyses.
Based on an extensive parametric study including floating floors and suspended ceilings, a database of narrowband sound power levels in the frequency range up to 125 Hz has been set up for a wide range of floor dimensions.
A procedure will be presented on how to process the numerical data and derive single number values that allow a comparison to the standardized rating of impact sound insulation according to DIN EN ISO 717-2.
The derived data is made available through a graphical user interface (GUI), which is freely available. The database is open for additional geometries and will be extended with different constructions in the near future.
Safety assurance is a major challenge in the design of modern embedded systems that has become increasingly difficult in recent years. Growing system sizes and the rise of Cyber-Physical systems confront safety engineers with large sets of configurations to be analyzed. Current approaches are usually carried out at design time and do not address the need for automated assessments in the field. With Component Fault Trees (CFTs) there exists a component-based methodology that enables an efficient modular composition of safety artifacts. The combined model is a system-level CFT that can be analyzed by means of popular Fault Tree Analysis techniques that are widely accepted in the industry. However, when composing models, their interfacing elements must be connected manually which impedes the automation of the procedure. In this work, we introduce the notion of flow types that represent a particular kind of component interaction and define a taxonomy of related failure behavior. By annotating CFTs with types, a machine-readable vocabulary is provided that allows for an automated interconnection of their interfaces. This way, the automatic composition of models according to system architecture is enabled, allowing for automated safety assessments on system-level. We demonstrate the feasibility of our approach using an example ethylene vaporization unit.
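As a rough, hypothetical sketch of the flow-type idea (component, port, and type names are invented for illustration and are not taken from the paper), typed failure interfaces can be wired automatically whenever an output flow type matches an input flow type:

```python
# Hypothetical sketch: auto-connecting component failure interfaces by flow type.
# Each component declares typed output and input ports; ports are wired whenever
# an output type matches an input type of another component.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    out_ports: dict = field(default_factory=dict)   # port name -> flow type
    in_ports: dict = field(default_factory=dict)    # port name -> flow type

def auto_connect(components):
    """Return (source, out_port, target, in_port) tuples for matching flow types."""
    connections = []
    for src in components:
        for o_port, o_type in src.out_ports.items():
            for dst in components:
                if dst is src:
                    continue
                for i_port, i_type in dst.in_ports.items():
                    if o_type == i_type:
                        connections.append((src.name, o_port, dst.name, i_port))
    return connections

pump = Component("Pump", out_ports={"flow_out": "ethylene_flow"})
vaporizer = Component("Vaporizer", in_ports={"flow_in": "ethylene_flow"},
                      out_ports={"vapor_out": "ethylene_vapor"})
print(auto_connect([pump, vaporizer]))
```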
In this research we present a short-distance capacitively coupled wireless power transfer (WPT) system with a self-oscillating half-bridge converter, which uses a positive feedback signal from the WPT system. With this novel implementation we find additional features of the system compared to linear amplifier based systems. Via the DC bias voltage supplying the power converter, we achieve a controllable output power, a tunable self-oscillation frequency, and an extension of the self-oscillation range. We present analytical expressions for the converter waveforms. The theoretical analysis is confirmed by circuit simulation results and measurements on a prototype featuring a half-bridge converter with Gallium Nitride (GaN) HEMTs. We achieve a system efficiency of 93% at a load power of 83 W and 150 kHz switching frequency.
We introduce a model for extractive meeting summarization based on the hypothesis that utterances convey bits of information, or concepts. Using keyphrases as concepts weighted by frequency, and an integer linear program to determine the best set of utterances, that is, covering as many concepts as possible while satisfying a length constraint, we achieve ROUGE scores at least as good as a ROUGE-based oracle derived from human summaries. This brings us to a critical discussion of ROUGE and the future of extractive meeting summarization.
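A minimal sketch of such a concept-coverage ILP is shown below, using PuLP as one possible solver interface; the utterances, keyphrase weights, and length budget are purely illustrative:

```python
# Sketch of a concept-coverage ILP for extractive summarization (PuLP used as an
# example solver interface; the data below is illustrative toy input).
import pulp

utterances = {0: "the budget was cut last quarter",
              1: "we need a new budget plan for marketing",
              2: "lunch will be ordered at noon"}
concepts = {"budget": 3, "marketing": 2, "plan": 2}          # keyphrase -> weight
occurs = {"budget": {0, 1}, "marketing": {1}, "plan": {1}}   # concept -> utterances
length = {i: len(u.split()) for i, u in utterances.items()}
max_len = 10                                                 # summary length budget

prob = pulp.LpProblem("concept_coverage", pulp.LpMaximize)
x = {i: pulp.LpVariable(f"x_{i}", cat="Binary") for i in utterances}   # utterance selected
c = {k: pulp.LpVariable(f"c_{k}", cat="Binary") for k in concepts}     # concept covered

prob += pulp.lpSum(concepts[k] * c[k] for k in concepts)               # maximize covered weight
prob += pulp.lpSum(length[i] * x[i] for i in utterances) <= max_len    # length constraint
for k in concepts:
    # a concept counts as covered only if a selected utterance contains it
    prob += c[k] <= pulp.lpSum(x[i] for i in occurs[k])

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([utterances[i] for i in utterances if x[i].value() == 1])
```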
This paper presents an unsupervised, graph-based approach for extractive summarization of meetings. Graph-based methods such as TextRank have been used for sentence extraction from news articles. These methods model text as a graph with sentences as nodes and edges based on word overlap. A sentence node is then ranked according to its similarity with other nodes. The spontaneous speech in meetings leads to incomplete, ill-formed sentences with high redundancy and calls for additional measures to extract relevant sentences. We propose an extension of the TextRank algorithm that clusters the meeting utterances and uses these clusters to construct the graph. We evaluate this method on the AMI meeting corpus and show a significant improvement over TextRank and other baseline methods.
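The toy sketch below illustrates only the TextRank-style baseline (utterances as nodes, word-overlap edges, PageRank scores via networkx); the clustering extension proposed above would group similar utterances before the graph is built. All data is illustrative:

```python
# Minimal TextRank-style sketch: utterances as nodes, word-overlap edges,
# PageRank scores for extraction (toy data).
import itertools
import networkx as nx

utterances = ["uh we should fix the budget",
              "the budget needs fixing soon",
              "anyone want coffee",
              "coffee sounds good to me"]

def overlap(a, b):
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / (1 + min(len(wa), len(wb)))

G = nx.Graph()
G.add_nodes_from(range(len(utterances)))
for i, j in itertools.combinations(range(len(utterances)), 2):
    w = overlap(utterances[i], utterances[j])
    if w > 0:
        G.add_edge(i, j, weight=w)

scores = nx.pagerank(G, weight="weight")
ranked = sorted(scores, key=scores.get, reverse=True)
print([utterances[i] for i in ranked[:2]])   # top-2 utterances as the extract
```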
The variation of the contact impedance of adhesive conductive EMI shielding tapes under high-temperature storage (HTS) at 110 °C is investigated. The adhesive tapes under test are implemented as signal return paths in microstrip lines. Changes in the ground plane impedance caused by adhesive deterioration are analyzed by S-parameter measurements from 1 MHz to 3 GHz. A circuit model for contact impedance degradation is proposed and applied in simulations of board-level shielding. Contact resistance and contact capacitance are found to increase significantly after 2400 h, causing the magnetic field shielding effectiveness to decline by 30 dB below 500 MHz.
A high-order finite element model for vibration analysis of cross-laminated timber assemblies
(2017)
The vibration behavior of cross-laminated timber components in the low-frequency range can be predicted with high accuracy by the finite element method. However, the modeling of assembled cross-laminated timber components has been studied only scarcely. The three-dimensional p-version of the finite element method, which is characterized by hierarchic high-order shape functions, is well suited to consider coupling and support conditions. Furthermore, a small number of degrees of freedom can be obtained in case of thin-walled structures using p-elements with high aspect ratios and anisotropic ansatz spaces. In this article, a model for cross-laminated timber assemblies made of volumetric high-order finite elements is presented. Two representative types of connections are investigated, one with an elastomer between the cross-laminated timber components and the other without. The model is validated, and suitable ranges for the stiffness parameters of the finite elements which represent the junctions are identified.
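For orientation, the hierarchic high-order shape functions mentioned here are commonly constructed from integrated Legendre polynomials; a typical one-dimensional definition (given only as background, not quoted from the article) is:

```latex
N_1(\xi) = \frac{1-\xi}{2}, \qquad
N_2(\xi) = \frac{1+\xi}{2}, \qquad
N_i(\xi) = \sqrt{\tfrac{2i-3}{2}} \int_{-1}^{\xi} P_{i-2}(t)\,\mathrm{d}t, \quad i \ge 3,
```

where P_k denotes the Legendre polynomial of degree k. Raising the polynomial order only adds new modes without changing the existing ones, which is what makes the basis hierarchic and well suited to anisotropic ansatz spaces.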
Rooted in multi-document summarization, maximum marginal relevance (MMR) is a widely used algorithm for meeting summarization (MS). A major problem in extractive MS using MMR is finding a proper query: the centroid-based query, which is commonly used in the absence of a manually specified query, cannot significantly outperform a simple baseline system. We introduce a simple yet robust algorithm to automatically extract keyphrases (KP) from a meeting, which can then be used as a query in the MMR algorithm. We show that the KP-based system significantly outperforms both the baseline and the centroid-based system. As human-refined KPs show even better summarization performance, we outline how to integrate the KP approach into a graphical user interface allowing interactive summarization to match the user's needs in terms of summary length and topic focus.
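A compact sketch of the MMR selection step with a keyphrase-based query is given below; the similarity measure (cosine over word counts), the trade-off parameter, and the example sentences are illustrative choices, not those of the paper:

```python
# Sketch of Maximal Marginal Relevance (MMR) selection with a keyphrase query
# (illustrative similarity: cosine over word-count vectors).
from collections import Counter
from math import sqrt

def cos(a, b):
    ca, cb = Counter(a.split()), Counter(b.split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = sqrt(sum(v * v for v in ca.values()))
    nb = sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def mmr(query, sentences, k=2, lam=0.7):
    selected, candidates = [], list(sentences)
    while candidates and len(selected) < k:
        def score(s):
            redundancy = max((cos(s, t) for t in selected), default=0.0)
            return lam * cos(s, query) - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

keyphrase_query = "budget plan marketing"   # keyphrases extracted from the meeting
sents = ["we need a budget plan", "the marketing budget is tight", "let us order pizza"]
print(mmr(keyphrase_query, sents))
```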
A gate drive circuit for gallium nitride (GaN) enhancement mode (e-mode) transistors is presented, which avoids parasitic turn-on of the power devices in the half-bridge configuration. New e-mode GaN devices turn on at very low threshold voltages between 1 V and 2 V. This makes the transistors highly sensitive to spurious turn-on and thus reduces the safety margin of the gate drive signals. To avoid this parasitic turn-on, a very low gate loop impedance is required. This protects the half-bridge against bridge shorts during the switching events and guarantees stable gate drive control with increased switching efficiency. The new gate drive circuit is developed in a SPICE simulation environment and verified in a prototype setup by a double pulse test. The simulation matches the experimental result very well and demonstrates the suppression of parasitic semiconductor turn-on with the proposed gate drive. Furthermore, the dissipated switching energy is reduced compared to a standard gate drive circuit. A high DC-DC converter efficiency of 98.67% at 1 kW output power is achieved by using the driving circuit for a buck converter prototype with 200 kHz switching frequency.
Today, the automatic separation of polymers from each other in an industrial scale is an unsolved problem. In laboratory environments, two methods are known whereby plastic is sorted either by color or by fluorescence decay time measurements that require fast synchronization and thus expensive equipment. A simple and pragmatic process is proposed to separate plastics from each other: all fluorescent photons are counted in relation to the absorbed photons. A theoretical model and an experimental setup are built in order to determine an apparatus specific quantum efficiency.
Nowadays, software development usually produces various models and description fragments. Some of these artifacts describe the core of the application, such as the data model or the user interaction model. Other artifacts describe cross-cutting concerns, such as security or the requirement "every change of data has to be confirmed by the user before it is written into the database". During the development process these artifacts are combined, transformed, and finally implemented manually or even automatically. For instance, a designer may combine a model of a dialog component specifying an action that changes data with a common description of a generic confirm dialog. The integrated dialog description may afterwards be implemented by a programmer. The manual combination of both artifacts and the transformation of the combined description into code are error-prone and hard to change. Industry basically tests or actually uses two approaches for the combination today: 1. At the level of code and execution, Aspect-Oriented Programming (AOP) is used [KLM+97]. An aspect defines a cross-cutting concern and the weaving instructions; a code weaver performs the actual weaving at compile time or runtime. 2. At the level of design and analysis models, Model-Driven Software Development (MDSD) is used [KWB03]. A model in MDSD is a first-class development artifact. Such models are significantly more abstract than the implementing code, but they cannot be executed. Hence, an abstract model is transformed into another, typically more detailed one, and a series of such transformations results in executable code. Thereby, a model transformer or a code generator reads some of the artifacts; the other artifacts and the combination rules are implemented in transformation rules or code generation templates (or elsewhere in the transformation/generation approach).
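As a small illustration of the weaving idea (not the AOP weaver of [KLM+97], just a hypothetical Python stand-in), the confirm-before-write concern from the example above can be wrapped around a data-changing action like this:

```python
# Illustration of a cross-cutting concern woven around a data-changing action:
# a plain Python decorator stands in for an AOP weaver (all names hypothetical).
def confirm_before_write(action):
    """Advice: ask the user for confirmation before the wrapped action runs."""
    def wrapper(*args, **kwargs):
        answer = input(f"Apply change via {action.__name__}? [y/n] ")
        if answer.strip().lower() == "y":
            return action(*args, **kwargs)
        print("Change discarded.")
        return None
    return wrapper

@confirm_before_write
def update_customer_address(customer_id, new_address):
    # core concern: write the change into the database (stubbed here)
    print(f"Writing address of customer {customer_id}: {new_address}")

update_customer_address(42, "Hochschulstr. 1, Rosenheim")
```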
Environmental regulations force car manufacturers to renew the powertrain technology portfolio offered to the customer to comply with greenhouse gas (GHG) emission targets. In turn, automotive companies face the task of identifying the “right” powertrain technology portfolio consisting of, for example, internal combustion engines and electric vehicles, because the selection of a particular powertrain technology portfolio affects different company targets simultaneously. What makes this decision even more challenging is that future market shares of the different technologies are uncertain. Our research presents a new decision-support approach for assembling optimal powertrain technology portfolios while making decision-makers aware of the trade-offs between the achievable profit, the achievable market share, the market share risk, and the GHG emissions generated by the selected vehicle fleet. The proposed approach combines “a posteriori” decision-making with multi-objective optimization. In an application case, we feed the outlooks of selected market studies into the proposed decision-support system. The result is a visualization and analysis of the current real-world decision-making problem faced by many automotive companies. Our findings indicate that for the proposed GHG restriction at work in 2030 in the European Union, no optimal powertrain technology portfolio with less than 35% of vehicles equipped with an electric motor exists.
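A minimal sketch of the "a posteriori" idea is to compute the set of non-dominated portfolios and present only those trade-offs to the decision-maker; the portfolios and objective values below are invented toy data:

```python
# Minimal sketch of "a posteriori" decision support: keep only non-dominated
# powertrain portfolios (toy data; objectives: maximize profit and market share,
# minimize fleet GHG emissions).
portfolios = [
    {"name": "ICE-heavy", "profit": 9.0, "share": 0.30, "ghg": 120.0},
    {"name": "Mixed",     "profit": 8.0, "share": 0.32, "ghg": 95.0},
    {"name": "BEV-heavy", "profit": 6.5, "share": 0.28, "ghg": 60.0},
    {"name": "Dominated", "profit": 6.0, "share": 0.27, "ghg": 100.0},
]

def dominates(a, b):
    """a dominates b if it is at least as good in all objectives and better in one."""
    ge = a["profit"] >= b["profit"] and a["share"] >= b["share"] and a["ghg"] <= b["ghg"]
    gt = a["profit"] > b["profit"] or a["share"] > b["share"] or a["ghg"] < b["ghg"]
    return ge and gt

pareto = [p for p in portfolios
          if not any(dominates(q, p) for q in portfolios if q is not p)]
print([p["name"] for p in pareto])   # the trade-off set shown to the decision-maker
```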
A new approach for the friction and wear characterisation of polymer fibres under dry, mixed, and hydrodynamic sliding conditions is developed. The production process of the tested polymer fibres is described, and an introduction to fibre-reinforced concrete is given. Tribotesting is done on an optimised tribometer capable of measuring the friction and wear behaviour of polymer fibres with diameters of a few hundred µm under lubricated conditions. Three extruded polypropylene macro fibres with varying diameters are characterised under tribological conditions found in an industrial concrete mixing process. It is shown that detailed friction and wear data of polymer fibres can be gathered.
Because of its particular character, "trust" eludes a purely strategic approach, as it has to be balanced with the special needs of a fragile human resource. Still, trust is a resource of strategic relevance, since it establishes security, reduces complexity, and opens up potentials that go beyond a mere reduction of the "cost of control". These characteristics are of special interest for management, as competent action is always bound to problem-solving abilities under conditions of insecurity. Furthermore, there are numerous situations in which control is not applicable and in which we simply do not have full knowledge. Trust is also particular in that it requires little financial effort and grows through use; it is an investment with high amortization that deserves closer attention from management, especially in times of skilled-worker shortage and with respect to organizational learning as a valuable competitive advantage.
The purpose of this paper is to develop direct and indirect measures of the impact of trust on employee competence utilization and its subsequent realization in a strategic scorecard approach. This is based on a new integrative model in the organizational context applicable to strategic management. The objective of the paper is thus to answer the question whether a culture of trust is an adequate instrument for strategic management. The empirical evaluation of the model is based on a survey directed at employees of selected companies in a pragmatic mixed-methods application. The selected view considers the trust level of employees with regard to management as another individual and to the organization itself; both personal and institutional trust mechanisms are addressed. Thus, the employees evaluate their management.
As a result, the empirical study suggests that it is possible to specify frameworks for the establishment of trust, and that its impact on competence utilization is, moreover, significant. However, the concept of trust creates challenging requirements for managers. Based on the assumption that employees basically think and act in the interest of the company, benevolence and goodwill must be involved to avoid an opportunistic approach. In addition, accountability, shared norms, and the freedom to take responsibility need to be demonstrated through behavior.
The allowance of constructive, seriously taken criticism, a problem-solving attitude, and the acceptance of decisions are further key points for trust-supporting frameworks that have to be accepted and shown through behavior.
Point mutations of the fibroblast growth factor receptor (FGFR)2 receptor in intrahepatic cholangiocarcinoma (iCC) are mainly of unknown functional significance compared to FGFR2 fusions. Pemigatinib, a tyrosine kinase inhibitor, is approved for the treatment of cholangiocarcinoma with FGFR2 fusion/rearrangement. Although it is hypothesized that FGFR2 mutations may cause uncontrolled activation of the signaling pathway, the data for targeted therapies addressing FGFR2 mutations remain unclear. In vitro analyses demonstrated the importance of the p.C382R mutation for ligand-independent constitutive activation of FGFR2 with transforming potential. The following report describes the clinical case of a patient diagnosed with an iCC carrying an FGFR2 p.C382R point mutation, which was detected in liquid as well as in tissue-based biopsies. The patient was treated with pemigatinib, resulting in a sustained complete functional remission in fluorodeoxyglucose positron emission tomography/computed tomography over 10 months to date. The reported case is the first description of a complete functional remission under treatment with pemigatinib in a patient with a p.C382R mutation.
A Novel Approach to Identify Wood Species Optically using Fluorescence Lifetime Imaging Microscopy
(2021)
This contribution presents the results obtained with fluorescence lifetime imaging microscopy (FLIM) for the optical identification and differentiation of the four wood species walnut, beech, spruce, and maple. The experimental setup and the evaluation algorithm with which the experiments were carried out are explained briefly.
A novel approach to optically distinguish plastics based on fluorescence lifetime measurements
(2020)
In medical and biological research, fluorescence lifetime measurements and fluorescence lifetime imaging are already part of standardized analysis procedures. As first investigations have shown, polymers can be identified using fluorescence lifetime imaging and an evaluation algorithm. This contribution therefore pursues a novel approach for the direct differentiation of four polymers with fluorescence lifetime imaging. To this end, the evaluation algorithm is extended to compare several fluorescence lifetime images in order to prove that a distinction is possible.
We present a novel lecture browser that utilizes ranked key phrases displayed on a stream graph to overcome the shortcomings of traditional extractive (query-based) summaries. The system extracts key phrases from the ASR transcripts, performs an unsupervised ranking, and displays an initial number of phrases on the stream graph. This graph gives an intuition of when which key phrase is spoken and how dominant it is throughout the lecture. The user can select the phrases to be displayed and furthermore adjust the ranking of all phrases. All user interactions are logged to a server to improve the ranking algorithms and provide user-specific rankings.
A growing number of universities offer recordings of lectures, seminars and talks in an online e-learning portal. However, the user is often not interested in the entire recording, but is looking for parts covering a certain topic. Usually, the user has to either watch the whole video or “zap” through the lecture and risk missing important details. We present an integrated web-based platform to help users find relevant sections within recorded lecture videos by providing them with a ranked list of key phrases. For a user-defined subset of these, a StreamGraph visualizes when important key phrases occur and how prominent they are at the given time. To come up with the best key phrase rankings, we evaluate three different key phrase ranking methods using lectures of different topics by comparing automatic with human rankings, and show that human and automatic rankings yield similar scores using Normalized Discounted Cumulative Gain (NDCG).
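The NDCG measure used for comparing automatic with human rankings can be computed as in the following sketch; the relevance grades are toy values:

```python
# Sketch of Normalized Discounted Cumulative Gain (NDCG) for comparing an
# automatic key-phrase ranking against human relevance grades (toy data).
from math import log2

def dcg(relevances):
    """Discounted cumulative gain of a ranked list of relevance grades."""
    return sum(rel / log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(ranked_relevances):
    ideal = sorted(ranked_relevances, reverse=True)
    best = dcg(ideal)
    return dcg(ranked_relevances) / best if best > 0 else 0.0

# human relevance grades of the phrases, in the order produced by the system
system_ranking = [3, 2, 3, 0, 1]
print(round(ndcg(system_ranking), 3))
```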
The effectiveness of hot-melt coating depends on its uniformity and the extent to which the surface is completely covered. Compared with solvent-based coating, spreading is more limited in hot-melt coating; thus, the coating uniformity is more affected by the process parameters. This study presents a new method for identifying and quantifying factors influencing coating uniformity. The proposed method facilitates the determination of coating-thickness distribution and non-covered surface proportion based on micro-computed tomography measurements. The proposed method is based on particles that have undergone hot-melt coating in a fluidized bed, and it is compared to common methods for layer-thickness determination. The influencing factors are quantified in terms of the dependency of coating uniformity on the coating amount and material. Material properties have a significant impact because stearin and palm fat create different coating layers. The proposed method is confirmed to be well suited for analyzing coating qualities.
The objective of this research project is to develop a solar-powered refrigerator in the lower capacity range of up to 5 kW of cooling power. With the use of liquid pistons and one of the most efficient thermodynamic cycles known, the Stirling cycle, this product has the potential to outperform rival solar cooling technologies while providing inexpensive, reliable, quiet, environmentally-friendly, and efficient solar cooling for residential use, due to its straightforward manufacturing, simple design and inert working gas. Presented in this paper are the newest results of the theoretical and experimental investigation into deducing the key design parameters and system configuration of the so-called Liquid Piston Stirling Cooler (LPSC), which will help lead to optimal performance. Computer models of the complex unconstrained system have been constructed and validated using the modelling software Sage and shown to replicate system behavior with reasonable accuracy in experiments. The models have been used to predict system improvements and identify limitations imposed by the use of liquid pistons. The results to date provide a unique insight into a relatively little studied area in Stirling cycle research.
A Parametric Layout Study of Radiated Emission from High-Frequency Half-Bridge Switching Cells
(2016)
We present a numerical modeling study of radiated emission from half-bridge switching cells based on the method of moments (MoM). A low loop inductance cell design enables high-speed switching of power semiconductors, which is demonstrated on a prototype circuit. The layout is further optimized for low radiated emission by varying the heat sink placement; this is achieved by attaching the heat sink to the phase terminal. The performance of the structure in terms of loop impedance, electric field radiation, and sensitivity to cable attachments is numerically studied. Emission peaks arising from loop resonances can be reduced by over 20 dB with damping elements in the switching cell. The improved EMI performance of the proposed structure is attributed to decoupling of the loop current from the heat sink structure.
In this paper we present an algorithm that produces pitch and probability-of-voicing estimates for use as features in automatic speech recognition systems. These features give large performance improvements on tonal languages for ASR systems, and even substantial improvements for non-tonal languages. Our method, which we are calling the Kaldi pitch tracker (because we are adding it to the Kaldi ASR toolkit), is a highly modified version of the getf0 (RAPT) algorithm. Unlike the original getf0 we do not make a hard decision whether any given frame is voiced or unvoiced; instead, we assign a pitch even to unvoiced frames while constraining the pitch trajectory to be continuous. Our algorithm also produces a quantity that can be used as a probability of voicing measure; it is based on the normalized autocorrelation measure that our pitch extractor uses. We present results on data from various languages in the BABEL project, and show a large improvement over systems without tonal features and systems where pitch and POV information was obtained from SAcC or getf0.
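The sketch below illustrates only the core idea of a per-frame normalized autocorrelation, whose peak lag gives a pitch candidate and whose peak value can serve as a voicing-confidence measure; it is not the Kaldi implementation and omits the continuity constraint across frames:

```python
# Toy per-frame normalized autocorrelation (NCCF): the lag of the peak gives a
# pitch candidate, the peak value a voicing-confidence-like score.
import numpy as np

def frame_nccf(frame, fs, fmin=60.0, fmax=400.0):
    lags = np.arange(int(fs / fmax), int(fs / fmin))
    best_lag, best_val = lags[0], -1.0
    for lag in lags:
        a, b = frame[:-lag], frame[lag:]
        denom = np.sqrt(np.dot(a, a) * np.dot(b, b)) + 1e-12
        val = np.dot(a, b) / denom
        if val > best_val:
            best_lag, best_val = lag, val
    return fs / best_lag, best_val        # pitch estimate [Hz], NCCF peak

fs = 16000
t = np.arange(0, 0.04, 1 / fs)
voiced = np.sin(2 * np.pi * 150 * t)      # synthetic 150 Hz "voiced" frame
print(frame_nccf(voiced, fs))             # roughly (150.0, ~1.0)
```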
A Planar Magneto-Inductive Device with Modulated Mutual Inductance for Wireless Power Transmission
(2023)
A Magneto-Inductive Wave (MIW) structure for wireless power transfer (WPT) is proposed with switchable mutual inductance between neighboring coupling coils. Orthogonally placed switchable short-circuit loops are added in order to modulate the wave propagation properties of the MIW structure without changing the resonance frequency. We derive an analytical model based on lumped circuit analysis for the MIW waveguide and successfully evaluate the theoretical findings by circuit modeling and field simulation. We demonstrate switchable, i.e. addressable, guided wireless power transfer along an experimental MIW structure at a resonance frequency of 20.5 MHz.
Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-)effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this example of a setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
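A hypothetical, minimal version of such a GLM-based candidate selection might look as follows (synthetic data, invented covariates, statsmodels used as one possible implementation):

```python
# Hypothetical sketch: logistic GLM scoring insured persons for DMP selection
# (synthetic data; covariate names and the outcome definition are invented).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.integers(30, 80, n),
    "prior_cost": rng.gamma(2.0, 1500.0, n),
    "hba1c": rng.normal(7.5, 1.2, n),
})
# synthetic outcome: probability of high future cost (hypothetical relationship)
logit = -8 + 0.04 * df["age"] + 0.0004 * df["prior_cost"] + 0.5 * df["hba1c"]
p = 1 / (1 + np.exp(-logit))
df["high_cost"] = rng.binomial(1, p.to_numpy())

X = sm.add_constant(df[["age", "prior_cost", "hba1c"]])
model = sm.GLM(df["high_cost"], X, family=sm.families.Binomial()).fit()

# rank insured persons by predicted risk and pick the top candidates for the DMP
df["risk"] = model.predict(X)
candidates = df.sort_values("risk", ascending=False).head(50)
print(candidates[["age", "hba1c", "risk"]].head())
```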
Robust registration of two 3-D point sets is a common problem in computer vision.
The iterative closest point (ICP) algorithm is undoubtedly the most popular algorithm for solving this kind of problem. In this paper, we present the Picky ICP algorithm, which has been created by merging several extensions of the standard ICP algorithm, thus improving its robustness and computation time.
Using pure 3-D point sets as input data, we do not consider additional information like point color or neighborhood relations. In addition to the standard ICP algorithm and the Picky ICP algorithm proposed in this paper, a robust algorithm due to Masuda and Yokoya and the RICP algorithm by Trucco et al. are evaluated.
We have experimentally determined the basin of convergence, robustness to noise and outliers, and computation time of these four ICP-based algorithms.
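For readers unfamiliar with the baseline, the following compact sketch shows a standard ICP loop (nearest-neighbour matching plus closed-form SVD alignment) of the kind that variants such as Picky ICP build upon; the point sets are synthetic:

```python
# Compact standard ICP loop: nearest-neighbour matching with a k-d tree and a
# closed-form SVD rigid alignment (Kabsch), applied to synthetic 3-D point sets.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping P onto Q."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=30):
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)        # closest target point for each source point
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t
    return src

target = np.random.rand(200, 3)
angle = np.deg2rad(10)
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.05, -0.02, 0.01])
print(np.abs(icp(source, target) - target).max())   # should approach zero
```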
A DDR2 SDRAM test setup implemented on the Griffin III ATE test system from HILEVEL Technologies is used to analyse the row hammer bug. Row hammer pattern experiments are compared to standard retention tests.
The analysis confirms that the row hammer effect is caused by a charge excitation process depending on the number of stress activation cycles. The stress has to occur in the local neighborhood of the cells under test.
Shallow impurity levels support the responsible charge-carrier transport process in the DDR2 SDRAM technology used.
An accurate SPICE model is proposed in this paper to calculate the power losses of high-voltage converter systems. This supports power circuit optimization in the very first design stage. The parasitic package inductances and the nonlinear voltage-dependent semiconductor capacitances are taken into account. In addition, the high-frequency behavior of the power inductor is investigated. A detailed loss breakdown is performed to analyze the main sources of power loss and their physical reasons. The simulation results for the converter waveforms and its efficiency agree very well with the experimental results. The prototype boost converter shows a top efficiency greater than 98.3% at a switching frequency of 1 MHz and an output voltage of 400 V at the maximum output power of 1 kW.
Voice scrambling is widely used to add privacy to the radio communication of various authorities - but is also used by criminals to evade prosecution. In this article, we consider various analog voice scrambling techniques such as fixed frequency inversion, split-band inversion, and rolling-code scramblers. We explain how to break them using automatically extracted measures and scoring algorithms, and evaluate the proposed system using simulated data. While the simple inversion can be broken easily, the more advanced techniques require additional work prior to unsupervised automation; the presented user interface allows the user to refine the automatic results to obtain a high-quality solution.
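To make the simplest of these techniques concrete, the sketch below shows fixed frequency inversion in its most basic digital form (mirroring the spectrum within 0–fs/2 by sign alternation); split-band and rolling-code scramblers require the additional analysis described above. The signal is a toy example:

```python
# Minimal illustration of fixed frequency inversion: multiplying the samples by
# (-1)^n mirrors the spectrum within [0, fs/2]; applying the same operation twice
# restores the original signal, which is why a fixed inversion is easy to break.
import numpy as np

def invert_spectrum(signal):
    n = np.arange(len(signal))
    return signal * (-1.0) ** n

fs = 8000
t = np.arange(0, 0.01, 1 / fs)
speech_like = np.sin(2 * np.pi * 500 * t)       # toy low-frequency component

scrambled = invert_spectrum(speech_like)        # 500 Hz content now appears at 3500 Hz
descrambled = invert_spectrum(scrambled)        # identical operation undoes it
print(np.allclose(descrambled, speech_like))    # True
```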
We present a novel split and merge based method for dividing a given metric map into distinct regions, thus effectively creating a topological map on top of a metric one. The initial metric map is obtained from range data that are converted to a geometric map consisting of linear approximations of the indoor environment.
The splitting is done using an objective function that computes the quality of a region, based on criteria such as the average region width (to distinguish big rooms from corridors) and overall direction (which accounts for sharp bends).
A regularization term is used in order to avoid the formation of very small regions, which may originate from missing or unreliable sensor data. Experiments based on data acquired by a mobile robot equipped with sonar sensors are presented, which demonstrate the capabilities of the proposed method.
The International Spinal Cord Injury (InSCI) community survey has been developed to collect internationally comparable data on the lived experience of persons with spinal cord injury (SCI) in all 6 WHO regions.
The InSCI survey provides a crucial first step to generate evidence on functioning, health maintenance, and subjective well-being in persons with SCI globally.
A major challenge in setting up the InSCI community survey was to develop a data model and questionnaire that comprehensively captures what matters to people and, at the same time, is feasible and parsimonious in terms of participant burden.
This paper outlines the components of the InSCI data model and presents the question selection to operationalize the data model along the 4 guiding principles of efficiency, feasibility, comparability, and truth and discrimination.
The data model consists of 6 components operationalized with 125 questions including functioning (n = 28 body functions and structures; n = 42 activities and participation), contextual factors (n = 26 environmental; n = 19 personal factors), lesion characteristics (n = 2), and appraisal of health and well-being (n = 8).
The InSCI questionnaire presents an efficient and feasible solution with satisfactory comparability to other populations; however, its validity and reliability still need to be confirmed.
This paper proposes an ultra-low-inductance half-bridge switching cell with substrate-integrated 650 V GaN bare dies. A vertical parallel-plate waveguide structure with 100 μm layer thickness yields a commutation loop inductance of 0.5 nH, resulting in a negligible drain-source voltage overshoot in the standard inductive-load pulse test. On the other hand, reliable circuit operation requires an assessment of the isolation strength of the thin dielectric layer in the main commutation loop, because critically high local electric fields might occur between the pads. Measurements of the dielectric breakdown voltage followed by a statistical failure analysis provide a characteristic life of 14.7 kV and a 10% quantile of 13.5 kV in the Weibull-fitted data. This characteristic life depends strongly on the ambient temperature and drops to 4.1 kV at 125°C. Additionally, ageing tests show an increase in dielectric breakdown voltage after 500 h, 1000 h, and 2000 h of high-temperature storage at 125°C due to resin densification processes.
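The Weibull evaluation quoted above (characteristic life and 10% quantile) corresponds to a standard two-parameter Weibull fit; a sketch with scipy on synthetic breakdown data could look like this:

```python
# Sketch of the Weibull evaluation of dielectric breakdown voltages: fit a
# two-parameter Weibull distribution and read off the characteristic life
# (the scale parameter, i.e. the 63.2% quantile) and the 10% quantile.
from scipy import stats

# synthetic breakdown voltages in kV (illustrative, not measured data)
breakdown_kv = stats.weibull_min.rvs(c=12.0, scale=14.7, size=40, random_state=1)

shape, loc, scale = stats.weibull_min.fit(breakdown_kv, floc=0)   # fix location at 0
v10 = stats.weibull_min.ppf(0.10, shape, loc=loc, scale=scale)

print(f"characteristic life (eta): {scale:.1f} kV")
print(f"10% quantile:              {v10:.1f} kV")
```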
We describe a state-of-the-art large vocabulary continuous speech recognition (LVCSR) and keyword search (KWS) system trained on roughly 70 hours of conversational telephone speech. Using the Kaldi speech recognition toolkit, we investigate several aspects: for the acoustic front-end, we analyze the use of mel-frequency cepstral coefficients (MFCC), pitch and probability-of-voicing (PoV), and deep neural network (DNN) bottleneck (BN) features, as well as their feature-level combination ("tandem"). For the acoustic-phonetic decision tree, we explore different hidden Markov model (HMM) topologies for the glottalization phoneme /ʔ/ to model its typically short duration. For the acoustic model, we compare regular continuous HMMs with a sort of multi-codebook subspace Gaussian mixture model (SGMM), which lead to overall best word error rates (WER) of 58.7% and 56.3%, respectively. The KWS is implemented as a word lattice search and is augmented by a syllable lattice back-up search to capture out-of-vocabulary keywords as well as misrecognized lexical surface forms due to ambiguous prefix and hyphenation rules.
For cost–benefit analysis, health technologies with multiple effects should be valued in a single scenario by a holistic willingness‐to‐pay (WTP) measure. Recent studies instead used decomposed scenarios in which respondents report their WTP for each individual effect. Evidence can be found that the sum of such decomposed WTPs overestimates the holistic WTP, i.e. the holistic WTP is sub‐additive. This sum of decomposed WTPs can lead to wrong conclusions on the efficiency of health technologies. This is also relevant in decision making about new technologies that are valued separately in different surveys. To date, no utility‐theoretical and empirically validated aggregation function for decomposed WTPs exists. Within an expected utility model, this paper identifies as a reason for sub‐additivity – beside risk aversion with respect to wealth – a negative influence of better health on the marginal utility of wealth, i.e. marginal utility of wealth is smaller in better health states. Assuming mutual utility independence of health and wealth, a theoretically founded aggregation function covering these two impacts is derived. In a contingent valuation study, 92 patients with diabetes were asked to state their WTP for reductions of the risk of several diabetic complications in decomposed as well as in holistic scenarios. The patients had preferences with a significant negative influence of health on the marginal utility of wealth. Sub‐additivity occurred and theoretically founded aggregation could considerably lower the degree of overestimation. These results suggest that the theoretically founded aggregation function might reduce problems of sub‐additivity that can be economically relevant. Further empirical testing of the approach is indicated.
In an exemplary pilot project, the former barracks site near Bad Aibling is being refurbished and modernized into a "zero-energy town". The project addresses questions of energy-efficient construction in existing and new buildings, as well as the holistic consideration of buildings in the context of settlements, municipalities, and cities.
The project is accompanied scientifically within a research project funded by the BMWi. Hochschule Rosenheim is responsible for the energy monitoring and its evaluation. In addition to monitoring the individual buildings, an essential part is the investigation of a low-temperature district heating network that is fed exclusively by large solar thermal collector areas during the summer.
Within this low-exergy (Low-Ex) approach, innovations such as the "facade heating" system, the comparison of decentralised versus centralised ventilation systems in building refurbishment, and the planning of highly energy-efficient buildings are also addressed.
In the BMWi-funded research project Niedrigst-Energie-Hotel, an existing building was developed into an energy-saving, all-electric building by means of a comprehensive refurbishment concept. A highly insulated envelope, completely new building services and the intelligent interconnection of the system components promise low energy demand combined with a high level of comfort.
However, as the complexity of the systems increases, so does the risk that components do not work together optimally or even interfere with one another. Energy monitoring keeps track of these systems and their components and measures their energy flows. This knowledge ultimately allows a detailed evaluation of the innovative energy concept in operation and, to a certain extent, also the optimisation of the systems.
The tool MoniSoft, developed at the Karlsruhe Institute of Technology (Fachgebiet Bauphysik & Technischer Ausbau), is used to evaluate the extensive data sets. The platform-independent software simplifies monitoring and operational analysis. Since 2013, the Hochschule Rosenheim has played a major role in its further development.
Project goals - the following objectives can be achieved through the energy monitoring:
Assessment of the operation of the complex building services under real conditions, comparison of the measurement results with the planners' expectations, and development of optimisation options for the building services.
The goal of the municipality of Höhenkirchen-Siegertsbrunn was to build a children's centre (kindergarten, after-school care and crèche) that generates more energy over the annual balance than it consumes itself (plus-energy building). The undertaking was scientifically accompanied within a research project funded by the BMWi.
The Hochschule Rosenheim accompanied the project with measurements, aiming at operational optimisation and at a verifiable validation of the targeted goals. The tool MoniSoft, developed at the Karlsruhe Institute of Technology (Fachgebiet Bauphysik & Technischer Ausbau), was used to evaluate the extensive data sets. The platform-independent software simplifies monitoring and operational analysis. Since 2013, the Hochschule Rosenheim has played a major role in its further development.
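The plus-energy criterion is an annual balance: on-site generation over the year must exceed consumption. A minimal sketch of such a balance check on monitoring data, with hypothetical file and column names and not the MoniSoft implementation:

    # Minimal sketch of an annual plus-energy balance check on monitoring data.
    # File name and column names are hypothetical; this is not MoniSoft code.
    import pandas as pd

    df = pd.read_csv("monitoring_data.csv", parse_dates=["timestamp"])
    generated_kwh = df["pv_generation_kwh"].sum()      # on-site generation over the year
    consumed_kwh = df["total_consumption_kwh"].sum()   # total consumption over the year

    balance_kwh = generated_kwh - consumed_kwh
    status = "plus-energy target met" if balance_kwh > 0 else "target missed"
    print(f"Annual balance: {balance_kwh:+.0f} kWh ({status})")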
The article gives an overview of the various deviations dealt with in German building regulation law. Deviations from the substantive requirements of building regulation law are treated only briefly. Deviations in the verifications for construction products and construction types are discussed in detail with regard to possible causes and their consequences for the verification procedure. The topic of "deviations" for construction products with CE marking under the Construction Products Regulation (BauPVO) is also addressed. Deviations from the Technical Building Rules for planning, design and execution are explained with reference to the VV TB.
This study investigates the impact of generative AI systems like ChatGPT on semi-structured decision-making, specifically in evaluating undergraduate dissertations. We propose using Davis' technology acceptance model (TAM) and Schulz von Thun's four-sides communication model to understand human–AI interaction and necessary adaptations for acceptance in dissertation grading. Utilizing an inductive research design, we conducted ten interviews with respondents having varying levels of AI and management expertise, employing four escalating-consequence scenarios mirroring higher education dissertation grading. In all scenarios, the AI functioned as a sender, based on the four-sides model. Findings reveal that technology acceptance for human–AI interaction is adaptive but requires modifications, particularly regarding AI's transparency. Testing the four-sides model showed support for three sides, with the appeal side receiving negative feedback for AI acceptance as a sender. Respondents struggled to accept the idea of AI suggesting a grading decision through an appeal. Consequently, transparency about AI's role emerged as vital. When AI supports instructors transparently, acceptance levels are higher. These results encourage further research on AI as a receiver and the impartiality of AI decision-making without instructor influence. This study emphasizes communication modes in learning ecosystems, especially in semi-structured decision-making situations with AI as a sender, while highlighting the potential to enhance AI-based decision-making acceptance.
Natural Language Processing, such as speech-to-text technology, is increasingly implemented in collaboration software that is used by global virtual teams (GVT). GVT collaboration has become ubiquitous and has additionally accelerated during the COVID-19 pandemic. The main issues of global virtual teams are technology difficulties, language and time zone differences, and lower levels of psychological safety. Advances in collaboration technology aim at improving collaboration for GVT. But we know little about the acceptance of these technologies. Therefore, the objective of this study is to explore how Millennial and Gen Z members of GVT accept speech-to-text technology; namely, automated captions in virtual conferences and automated meetings transcripts. Particularly, we are comparing antecedents of acceptance across levels of language proficiency and psychological safety. We surveyed 530 users of speech-to-text technology in GVT both before and after they used the technology. The pre-survey was administered before the COVID-19 pandemic hit; when participants completed the post-survey all were under some degree of lockdown. Results suggest that use of the technology reduces anxiety and effort, but decreases performance expectation and hedonic motivation. Non-native speakers rate the technology more positively. The impact of psychological safety is limited to self-efficacy and anxiety.
Functioning information constitutes a relevant component for determining patients’ service needs and respective resource use. Diagnosis-Related Group (DRG) systems can be optimized by integrating functioning information.
First steps toward accounting for functioning information in the German DRG (G-DRG) system have been made; yet, there is no systematic integration of functioning information. The G-DRG system is part of the health system; it is embedded in and as such dependent on various stakeholders and vested interests.
This study explores the stakeholder’s perspective on integrating functioning information in the G-DRG system. A qualitative interview study was conducted with national stakeholders in 4 groups of the G-DRG system (health policy, administration, development, and consultations).
Interviews were analyzed using inductive thematic analysis. In total, 14 interviews were conducted (4 with the administration group and 10 with the consultation group). Three main themes were identified: (1) functioning information in the G-DRG system: opportunities and obstacles, (2) general aspects concerning optimizing G-DRG systems by integrating additional information, and (3) ideas and requirements on how to proceed.
The study offers insights into the opportunities and obstacles of integrating functioning information in the G-DRG system. The relevance of functioning information was evident. However, the value of functioning information for the G-DRG system was seen critically. Integrating functioning information alone does not seem to be sufficient and a systems approach is needed.
Outdoor performance analyses of photovoltaic modules can be advantageous compared to indoor investigations, as they take into account the influences of natural test conditions on the modules. However, such outdoor performance assessments usually suffer from poor accuracy due to undefined test conditions for the modules. This paper reports on a comprehensive concept for improved outdoor analysis which yields performance data with indoor laboratory precision. The approach delivers current-voltage characteristics for even more test conditions than required by the standard IEC 61853-1. Hence, curves of the modules' electrical parameters as a function of irradiance can be deduced for any temperature. The concept allows precise determination of temperature coefficients for user-defined irradiances, taking into account outdoor effects like light-soaking or light-induced degradation. The calibration and measurement uncertainty of the presented outdoor analysis method is evaluated quantitatively. An advanced outdoor set-up was used for the measurements.
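As an illustration of how a temperature coefficient can be derived from such data, the sketch below fits measured maximum power against module temperature within a fixed irradiance bin and references the slope to the value at 25 °C; the data are hypothetical and this is not the authors' calibration procedure.

    # Minimal sketch: temperature coefficient of maximum power at one irradiance,
    # estimated as the slope of a linear fit of P_max over module temperature and
    # expressed relative to the value at 25 °C. Data are hypothetical.
    import numpy as np

    t_module = np.array([18.0, 25.0, 33.0, 41.0, 48.0])    # module temperature in °C
    p_max = np.array([252.0, 248.0, 243.5, 238.9, 235.0])  # P_max in W at ~1000 W/m²

    slope, intercept = np.polyfit(t_module, p_max, 1)      # W per K
    p_at_25 = slope * 25.0 + intercept
    gamma = 100.0 * slope / p_at_25                        # %/K, referenced to 25 °C
    print(f"Temperature coefficient of P_max: {gamma:.2f} %/K")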
Evolutionary psychologists propose that human cognition evolved through natural selection to solve adaptive problems related to survival and reproduction, with its ultimate function being the enhancement of reproductive fitness. Following this proposal and the evolutionary-developmental view that ancestral selection pressures operated not only on reproductive adults, but also on pre-reproductive children, the present study examined whether young children show superior memory for information that is processed in terms of its survival value. In two experiments, we found such survival processing to enhance retention in 4- to 10-year-old children, relative to various control conditions that also required deep, meaningful processing but were not related to survival. These results suggest that, already in very young children, survival processing is a special and extraordinarily effective form of memory encoding. The results support the functional-evolutionary proposal that young children’s memory is “tuned” to process and retain fitness-related information.
Circular hollow sections of beech laminated veneer lumber (LVL) for use as temporary geotechnical soil nailing systems are currently being developed. Due to their permanent embedment in subsoil cement, investigations of the bond line quality of the timber sections are essential. This paper presents the bonding properties of flat and curved beech LVL after cyclic conditioning in a water–cement grout solution aimed at inducing short- and long-term alkaline attack on the timber. In total, 409 and 69 samples were tested in tensile shear tests after short-term and long-term conditioning, respectively.
Three different adhesive systems, a one-component polyurethane adhesive, a melamine–urea–formaldehyde adhesive and a melamine–urea–formaldehyde adhesive modified with a (polyvinyl) acetate adhesive, were investigated and compared. Short-term conditioning by submersion in a boiling cement suspension, rather than long-term conditioning, was found to be a reliable method for testing the bonding performance. In the tensile shear tests of samples subjected to long-term treatment, the wood material strength was the decisive criterion. Generally, tensile shear test samples of all investigated adhesives achieved reliable bonding for pressing pressures of up to 1.0 MPa. No relationship was recognised between the observed bonding failure and the wood properties tensile shear strength, wood failure percentage, fracture pattern and bulk density of the veneers adjacent to the bond line.
For the determination of the bond line integrity of curved veneer poles, it was necessary to test bonding quality in a combined test using curved and flat samples and to compare tensile shear strength with data determined on reference samples in the same veneer population without bond line.
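For orientation, the tensile shear strength reported in tests of this kind is the failure load divided by the glued shear area; a minimal sketch with illustrative numbers, not results from the study:

    # Minimal sketch: tensile shear strength as failure load divided by the glued
    # shear area (overlap length x sample width), as in EN 302-1 type tests.
    # Numbers are illustrative, not measured values from the study.
    failure_load_n = 2150.0        # failure load in N
    overlap_length_mm = 10.0       # glued overlap length in mm
    sample_width_mm = 20.0         # sample width in mm

    shear_area_mm2 = overlap_length_mm * sample_width_mm
    tensile_shear_strength = failure_load_n / shear_area_mm2   # N/mm² = MPa
    print(f"Tensile shear strength: {tensile_shear_strength:.1f} MPa")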
Innovative circular, hollow, laminated veneer lumber (LVL) beech sections for use as temporary geotechnical soil reinforcement members are currently being developed. Appropriate surface gluing quality between the veneers is fundamental to this subsoil application of the permanently cement-embedded, engineered timber product. The circular cross-section geometry and the permanently high-alkaline environment of the structural member are not covered by presently standardized testing and conditioning methods for examining LVL surface bond line quality. The sample conditioning and tensile shear test method compliant with EN 302-1 (Adhesives for load-bearing timber structures – test methods – part 1: determination of longitudinal tensile shear strength, European Committee for Standardization, Brussels, 2013) was modified to determine bonding parameters for circular, hollow LVL sections. Bond line curvature, groove cutting depth and sample geometry were found to greatly influence the stress distribution, percentage of wood failure and tensile shear strength.
Short-term alkaline treatment of the test samples did not significantly influence the bonding performance, wood failure percentage, tensile shear strength or fracture patterns. To improve tensile shear strength, adhesives with different material rigidities were used and compared. An orthotropic, elastic numerical analysis revealed a greater influence of adherend elasticity than adhesive elasticity on the stress distribution within the bond line. With regard to determining the bond line integrity of curved veneer poles, a sample geometry compliant with EN 302-1 (2013) was developed and numerically evaluated.
Background and objective: Bisphosphonates are an effective therapy for osteoporosis. Intolerance, in particular gastrointestinal side effects, is a frequent cause of treatment discontinuation. The aim of this study is to examine differences in adherence (acceptance, persistence/discontinuation, compliance) between daily (ALD-D) and weekly (ALD-W) dosing of the bisphosphonate alendronate.
Patients and methods: The analysis is based on prescription data from two random samples of 144 female patients each, aged 45 years or older, with daily or weekly dosing, over a period of 12 months from initiation of therapy. Discontinuation was defined as using up the last prescription. Acceptance was measured as the proportion of patients without discontinuation after the first prescription. Compliance was measured by the MPR (medication possession ratio = percentage of days in the year on which medication was available). Therapeutically relevant compliance, leading to a reduction in fracture risk, was assumed at an MPR > 80%.
Results: 31.3% (ALD-W) vs. 45.8% (ALD-D) of patients discontinued therapy after the first prescription; 53.5% (ALD-W) vs. 72.2% (ALD-D) discontinued within one year. The discontinuation rate was significantly higher with daily dosing (p = 0.0035). The average time to discontinuation, including treatment interruptions, was 220 days (ALD-W) vs. 169 days (ALD-D). The average compliance of all patients was 51.7% (ALD-W) vs. 37.7% (ALD-D). 30.6% (ALD-W) vs. 19.2% (ALD-D) of all patients (p = 0.0295) achieved therapeutically relevant compliance.
Conclusions: A large number of patients discontinued bisphosphonate therapy, many of them already after the first prescription. Adherence was improved by reducing the dosing frequency but remained suboptimal. There is a need for therapies that can lead to better adherence.
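The medication possession ratio used above is simply the share of days in the observation period covered by dispensed medication, with values above 80% rated as therapeutically relevant compliance. A minimal sketch with hypothetical prescription records:

    # Minimal sketch: medication possession ratio (MPR) as the percentage of days
    # in a 365-day observation period covered by dispensed medication, with the
    # >80% threshold for therapeutically relevant compliance. Records are
    # hypothetical and overlapping supplies are not handled separately.
    prescriptions = [
        {"dispensed_on_day": 0,   "days_supplied": 28},
        {"dispensed_on_day": 35,  "days_supplied": 28},
        {"dispensed_on_day": 70,  "days_supplied": 84},
    ]

    observation_days = 365
    covered_days = sum(p["days_supplied"] for p in prescriptions)
    mpr = min(covered_days / observation_days, 1.0) * 100
    verdict = "therapeutically relevant" if mpr > 80 else "below threshold"
    print(f"MPR: {mpr:.1f}% ({verdict})")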
Objectives
To examine which professionals administered which assessment instruments in which patients in clinical practice during first rehabilitation after newly acquired spinal cord injury (SCI) and the differences in the frequencies of different assessments between patient groups.
Setting
Specialized SCI acute care and rehabilitation clinic.
Methods
Patients with SCI, aged 18 years and older, admitted for first rehabilitation between December 2014 and December 2015 were analyzed. Descriptive statistics were calculated for 54 selected assessments. For assessments used in both paraplegic and tetraplegic patients, p values based on the χ² test were calculated.
Results
One hundred and nineteen patients were screened. Forty-one assessments were administered, of which 10 were administered, on average, more than once per patient. The most frequently used assessments were the Spinal Cord Independence Measure III (7.7 times per patient), Skin Assessment (3.6 times), and Manual Muscle Test (3.2 times for Lower Extremities; 2.5 times for Upper Extremities). The American Spinal Injury Association Impairment Scale was administered on average 1.9 times per patient. More variation in the number of assessments per patient was observed in patients with complete and incomplete lesions compared to patients with paraplegia and tetraplegia.
Conclusion
Assessments covering neurological functioning, mobility, and self-care are used in clinical practice during first rehabilitation of patients with SCI, while others covering autonomic functioning, pain, participation, or quality of life are still missing. Based on these observations and national and international requirements, a meaningful standard for an assessment toolkit, applicable in general and in specific subgroups, needs to be defined and implemented.
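The group comparison described in the methods relies on the χ² test of independence; a minimal sketch for a single assessment with hypothetical counts, not the study data:

    # Minimal sketch: chi-square test of independence comparing how often one
    # assessment was administered to paraplegic vs. tetraplegic patients.
    # The counts are hypothetical, not data from the study.
    from scipy.stats import chi2_contingency

    #                assessed  not assessed
    contingency = [[40,       15],   # paraplegia
                   [30,       34]]   # tetraplegia

    chi2, p_value, dof, expected = chi2_contingency(contingency)
    print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")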
The advanced Attitude and Orbit Control System (AOCS) design of the Small GEO platform is now being adapted for the first commercial mission.
The Small GEO telecommunications satellite is a new development to fill a niche in the telecom industry for small platforms weighing about 1.5 tonnes and targeting payloads of 300 kg and 3 kW. The first mission will launch into Geostationary Transfer Orbit (GTO).
Small GEO is being developed by a consortium led by OHB-System AG. The Swedish Space Corporation is a partner in the consortium and supplies the AOCS and Electric Propulsion (EP) subsystems. The project is currently in Phase C, and the first mission will fly in 2012. This article gives an overview of the AOCS development status.
The AOCS architecture is a three-axis stabilized system using reaction wheels for attitude control, star trackers for attitude determination, and EP for orbit control. The AOCS software is being developed using model-based design techniques and test driven development. Results from subsystem level testing of flight code will be presented.
The AOCS design is characterized by a number of advances in technology beyond traditional telecom satellite designs. Perhaps the largest deviation from a traditional design is the complete reliance on EP for orbit control. Angular momentum management of the reaction wheels relies solely upon EP in the nominal modes. The EP is not used in the safe modes, and therefore a cold gas system is included on board.
The cold gas system uses xenon, the same propellant as the EP. Another advance is the reliance upon APS-based star trackers. APS (Active Pixel Sensor) star trackers have a number of advantages over their CCD-based counterparts in terms of robustness. The traditional fine sun sensor is simplified to a fault-tolerant system of solar cells giving low, but more than adequate, accuracy.
In addition, a GPS sensor will be flown on-board as an experiment.
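Momentum management as described here means offloading accumulated reaction-wheel angular momentum with an external torque, provided in the nominal modes by the electric propulsion. The sketch below is purely illustrative of such an unloading check; the values, thresholds and logic are hypothetical and not taken from the Small GEO flight software.

    # Purely illustrative sketch of wheel momentum unloading logic: when stored
    # angular momentum exceeds a per-axis threshold, an external torque (here
    # assumed to come from electric propulsion) is requested to reduce it.
    # All values are hypothetical, not Small GEO flight parameters.
    import numpy as np

    h_wheels = np.array([12.0, -3.0, 7.5])   # stored wheel momentum per body axis [Nms]
    h_limit = 10.0                            # unloading threshold per axis [Nms]
    ep_torque = 0.02                          # assumed usable EP torque magnitude [Nm]

    for axis, h in zip("xyz", h_wheels):
        if abs(h) > h_limit:
            excess = abs(h) - h_limit
            burn_time = excess / ep_torque    # time needed to dump the excess momentum
            print(f"{axis}-axis: dump {excess:.1f} Nms, EP burn of about {burn_time:.0f} s")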
Objective: Many studies published in the journal WORK in the recent decades have discussed work and employment trends. However, the dimensions of these contributions over time have not been reviewed. The main objective of this study was to investigate the knowledge development in regard to work-related rehabilitation in WORK over the last two decades.
Methods: A scoping review was conducted using the following five stages: (i) identifying research question, (ii) identifying relevant studies, (iii) study selection, (iv) charting, summarizing, and collating the data, and (v) reporting the results. Studies were selected from the WORK Article Database.
Results: Seventy-five relevant studies were identified. The findings reflect that WORK has published papers from across the world, with most of the studies from the United States, Sweden, Canada, and Hong Kong. The complexity and multi-factorial nature of work-related rehabilitation was reflected in the application of quantitative, qualitative, and mixed-method research approaches, as well as case studies. Study participants were characterized by work- and non-work-related injuries or systemic diseases/chronic illness, fulfilled certain socio-demographic characteristics, and represented various stakeholders. Fewer studies drew on secondary sources. One recurring theme was noted in the findings: 'maintaining/obtaining/returning to secure and stable employment/work'.
Conclusions: Four key reflections evolved from this scoping review that provide potential avenues for future research. These key reflections include (i) the national, transnational and international dimension of the reviewed studies, (ii) the various societal levels informing work-related rehabilitation practices, (iii) the diversity of methodologies applied in current research, and (iv) the variability of terminology used within the reviewed studies. The journal WORK has published a variety of research over the last two decades and contributed significantly to our current understanding of work-related rehabilitation. However, further research in these reflective areas would expand the current knowledge base.
Providing a subset of previously studied information as a retrieval cue can impair memory for the remaining information. Previous work with adults has shown that such part-list cuing impairment (PLCI) can be transient or lasting, depending on study condition. Here, we investigated the persistence of PLCI in children. Three age groups (7- and 8-year-olds, 9- and 10-year-olds, and 12- to 14-year-olds) learned a list of items, either through a single study trial (1-study condition) or through two study-test cycles (2-study-test condition). Subsequently, two recall tests were administered, with part-list cues being provided in the first (critical) test but not in the second (final) test. Of primary interest was whether the detrimental effect of part-list cuing induced in the critical test would persist to the uncued final test. In 12- to 14-year-olds, we found an adult-like pattern of results, with lasting impairment in the 1-study condition but transient impairment in the 2-study-test condition. In contrast, in the two younger age groups, we found PLCI to be lasting in both study conditions, suggesting age differences in the persistence of PLCI. The results are discussed in light of a recently proposed two-mechanism account of PLCI that attributes lasting impairment to retrieval inhibition and transient impairment to strategy disruption. Following this account, the results suggest that whereas 12- to 14-year-olds’ PLCI was caused by (lasting) retrieval inhibition in the 1-study condition and by (transient) strategy disruption in the 2-study-test condition, 7- and 8-year-olds’ and 9- and 10-year-olds’ PLCI was caused by (lasting) retrieval inhibition in both study conditions.
Agiles Projektmanagement
(2015)