External Publications
This paper is based on the talk of the same title delivered at the “Dock of Learning” Symposium. I reflect on design research with an eye on the critical issues of our time. Design research has the potential to take on these issues; there are, however, challenges. One threat I see is the divergence of design research and art research, which I attempt to tackle by examining Jonas’ Research through Design and Borgdorff’s Artistic Research with help from John Dewey’s philosophy.
The 0-1 multidimensional knapsack problem (MKP) is a well-known combinatorial optimization problem with several real-life applications, for example, in project selection. Genetic algorithms (GA) are effective heuristics for solving the 0-1 MKP. Multiple individual GAs with specific characteristics have been proposed in the literature. However, so far these approaches have only been partially compared, across multiple studies with unequal conditions. Therefore, to identify the “best” genetic algorithm, this article reviews and compares 11 existing GAs. The authors' tests provide detailed information on the GAs themselves as well as on their performance. The authors validated fitness values and required computation times across varying problem types and environments. The results demonstrate the superiority of one GA.
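To make the setting concrete, the following is a minimal sketch of a genetic algorithm for the 0-1 MKP. It is not one of the 11 reviewed GAs; the operators and parameters (death penalty for infeasible individuals, binary tournament selection, one-point crossover, bit-flip mutation, population size, mutation rate) are generic textbook assumptions.

```python
import random

def ga_mkp(profits, weights, capacities, pop_size=50, generations=200,
           mutation_rate=0.01, seed=0):
    """Minimal GA for the 0-1 MKP. profits[i]: profit of item i;
    weights[j][i]: weight of item i in dimension j; capacities[j]:
    capacity of dimension j."""
    rng = random.Random(seed)
    n = len(profits)

    def fitness(x):
        # Death penalty: infeasible individuals get fitness 0.
        if any(sum(w[i] * x[i] for i in range(n)) > c
               for w, c in zip(weights, capacities)):
            return 0
        return sum(p * xi for p, xi in zip(profits, x))

    def select(pop):
        # Binary tournament selection.
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(pop), select(pop)
            cut = rng.randrange(1, n)                  # one-point crossover
            child = [1 - g if rng.random() < mutation_rate else g
                     for g in p1[:cut] + p2[cut:]]     # bit-flip mutation
            nxt.append(child)
        pop = nxt
    best = max(pop, key=fitness)
    return best, fitness(best)

# Tiny instance: one weight dimension, capacity 10; the optimum takes the
# items with profits 13 and 8 (total 21, total weight 10).
print(ga_mkp([10, 13, 7, 8], [[5, 7, 4, 3]], [10]))
```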
Design Research in Germany?
(2015)
The Samaritan is a guiding ideal for helpers' actions with strong suggestive power, and it is also invoked for those working in nursing. After a phase of reflexive problematization of the motivation to help, the Samaritan is today experiencing renewed affirmation under the banner of positive psychology. This affirmation, however, refers above all to the motivation to help. The aspect of professional competence, which is central to professional helping but which the ideal does not convey, remains problematic.
EMDLAB: A toolbox for analysis of single-trial EEG dynamics using empirical mode decomposition
(2015)
Background:
Empirical mode decomposition (EMD) is an empirical, data-driven decomposition technique. Recently, there has been growing interest in applying EMD in the biomedical field.
New method:
EMDLAB is an extensible plug-in for the EEGLAB toolbox, which is an open software environment for electrophysiological data analysis.
Results:
EMDLAB can be used to perform, easily and effectively, four common types of EMD on EEG data: plain EMD, ensemble EMD (EEMD), weighted sliding EMD (wSEMD) and multivariate EMD (MEMD). In addition, EMDLAB is a user-friendly toolbox that is closely integrated into the EEGLAB toolbox.
Comparison with existing methods:
EMDLAB gains an advantage over other open-source toolboxes by exploiting the advantageous visualization capabilities of EEGLAB for extracted intrinsic mode functions (IMFs) and Event-Related Modes (ERMs) of the signal.
Conclusions:
EMDLAB is a reliable, efficient, and automated solution for extracting and visualizing IMFs and ERMs obtained by EMD algorithms in EEG studies.
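As background for readers unfamiliar with EMD, here is a deliberately simplified Python/NumPy sketch of the plain-EMD sifting idea (EMDLAB itself is a MATLAB/EEGLAB plug-in); the stopping criteria and boundary handling below are crude assumptions, not EMDLAB's implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, t, n_sift=10):
    """Extract one IMF by crude sifting: repeatedly subtract the mean of
    the upper and lower extrema envelopes. x, t: 1-D NumPy arrays."""
    h = x.copy()
    for _ in range(n_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 3 or len(minima) < 3:
            break  # too few extrema to build spline envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def plain_emd(x, t, max_imfs=5):
    """Decompose x into IMFs plus a residual (no formal stop criterion)."""
    imfs, residual = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(residual, t)
        imfs.append(imf)
        residual = residual - imf
        if len(argrelextrema(residual, np.greater)[0]) < 3:
            break  # residual is (near-)monotonic
    return imfs, residual
```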
As part of a research project to develop design solutions for concrete-masonry buildings for the Canadian market, the apparent sound insulation performance of hybrid assemblies with concrete masonry walls and wood joist floors was evaluated. In this paper, the effect of junction coupling is investigated in an ISO 15712 flanking prediction context. Airborne flanking path data predicted according to ISO 15712 are compared to data measured using the indirect ISO 10848 shielding method. Recommendations are made on how appropriate the application of ISO 15712 is for this type of hybrid assembly.
The performance of cognitive models often depends on the settings of specific model parameters, such as the rate of memory decay or the speed of motor responses. The systematic exploration of a model’s parameter space can yield relevant insights into model behavior and can also be used to improve the fit of a model to human data. However, exhaustive parameter space searches quickly run into a combinatorial explosion as the number of parameters investigated increases. Taking an established instance-based learning task as an example, we show how simulation using parallel computing and derivative-free optimization methods can be applied to investigate the effects of different parameter settings. We find that both global optimization methods involving genetic algorithms and local methods yield satisfactory results in this case. Furthermore, we show how a model implemented in a specific cognitive architecture (ACT-R) can be mathematically reformulated to prepare the application of derivative-based optimization methods, which promise further efficiency gains for quantitative analysis.
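The following is a minimal sketch of this workflow using SciPy: a black-box error function stands in for running the cognitive model against human data (the two parameters and the error surface are hypothetical), and it is minimized once with a local derivative-free method and once with a global evolutionary method.

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

def model_error(params):
    """Placeholder for running the cognitive model with given parameters
    and returning its misfit to human data (hypothetical surrogate);
    in practice each evaluation would run the ACT-R model, often in parallel."""
    decay, noise = params
    return (decay - 0.5) ** 2 + (noise - 0.25) ** 2 \
        + 0.01 * np.sin(40 * decay) ** 2   # mild ruggedness

bounds = [(0.0, 1.0), (0.0, 1.0)]

# Local derivative-free search (Nelder-Mead simplex).
local = minimize(model_error, x0=[0.8, 0.8], method="Nelder-Mead")

# Global evolutionary search, conceptually close to a genetic algorithm.
glob = differential_evolution(model_error, bounds, seed=0)

print("local :", local.x, local.fun)
print("global:", glob.x, glob.fun)
```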
It is simple to query a relational database because all columns of the tables are known and the language SQL is easily applicable. In NoSQL, there usually is no fixed schema and no query language. In this article, we present NotaQL, a data-transformation language for wide-column stores. NotaQL is easy to use and powerful. Many MapReduce algorithms like filtering, grouping, aggregation and even breadth-first search, PageRank and other graph and text algorithms can be expressed in two or three short lines of code.
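Since NotaQL's concrete syntax is not reproduced here, the following Python sketch merely illustrates, on invented data, the kind of schema-flexible grouping and aggregation that, according to the article, NotaQL expresses in two or three short lines over a wide-column store:

```python
# A wide-column table: row key -> {column: value}; columns may differ
# per row, since wide-column stores have no fixed schema.
people = {
    "row1": {"name": "Alice", "city": "Mainz", "salary": 50000},
    "row2": {"name": "Bob", "city": "Trier", "salary": 60000},
    "row3": {"name": "Carol", "city": "Mainz"},   # no salary column
}

# Filtering + grouping + aggregation: average salary per city,
# skipping rows that lack the salary column.
groups: dict[str, list[int]] = {}
for row in people.values():
    if "salary" in row:
        groups.setdefault(row["city"], []).append(row["salary"])

avg_salary = {city: sum(s) / len(s) for city, s in groups.items()}
print(avg_salary)   # {'Mainz': 50000.0, 'Trier': 60000.0}
```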
PURPOSE
Reconstruction of x-ray computed tomography (CT) data remains a mathematically challenging problem in medical imaging. Complementing the standard analytical reconstruction methods, sparse regularization is growing in importance, as it allows inclusion of prior knowledge. The paper presents a method for sparse regularization based on the curvelet frame for the application to iterative reconstruction in x-ray computed tomography.
METHODS
In this work, the authors present an iterative reconstruction approach based on the alternating direction method of multipliers using curvelet sparse regularization.
RESULTS
Evaluation of the method is performed on a specifically crafted numerical phantom dataset to highlight the method's strengths. Additional evaluation is performed on two real datasets from commercial scanners with different noise characteristics, a clinical bone sample acquired in a micro-CT and a human abdomen scanned in a diagnostic CT. The results clearly illustrate that curvelet sparse regularization has characteristic strengths. In particular, it improves the restoration and resolution of highly directional, high contrast features with smooth contrast variations. The authors also compare this approach to the popular technique of total variation and to traditional filtered backprojection.
CONCLUSIONS
The authors conclude that curvelet sparse regularization is able to improve reconstruction quality by reducing noise while preserving highly directional features.
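As a hedged numerical sketch of the underlying optimization, the following minimizes ½‖Ax − b‖² + λ‖Wx‖₁ with the alternating direction method of multipliers; a generic orthonormal transform W stands in for the curvelet frame and a small random matrix A for the CT forward operator, so this illustrates the algorithmic skeleton, not the paper's implementation.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def admm_l1(A, b, W, lam=0.1, rho=1.0, iters=200):
    """Minimize 0.5*||Ax-b||^2 + lam*||Wx||_1 via ADMM with splitting
    z = Wx; W is assumed orthonormal, as for a tight frame."""
    x = np.zeros(A.shape[1])
    z = np.zeros(W.shape[0])
    u = np.zeros_like(z)                 # scaled dual variable
    # x-update solves (A^T A + rho W^T W) x = A^T b + rho W^T (z - u).
    lhs = A.T @ A + rho * (W.T @ W)
    for _ in range(iters):
        x = np.linalg.solve(lhs, A.T @ b + rho * W.T @ (z - u))
        z = soft_threshold(W @ x + u, lam / rho)
        u = u + W @ x - z
    return x

# Tiny illustration: with W = identity this reduces to a lasso-type problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 20))
x_true = np.zeros(20); x_true[[2, 7, 15]] = [1.0, -2.0, 0.5]
x_rec = admm_l1(A, A @ x_true, np.eye(20), lam=0.5)
```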
Early meta-level: deeper understanding of connectivity-states and consequences for state definition
(2015)
Calls for better dementia care should take into account that approaches already exist today that practice patient-oriented long-term care for people with dementia. One example is the care model in Stralsund, in which the specialist physician plays a central role even though the care is not physician-centered but rather co-productive and team-oriented in character.
Rising numbers of people with dementia, their complex needs and an inadequate reality of care call for innovative and sustainable solutions. This contribution presents a care model for people with dementia that was developed in response to acute care problems and that implements comprehensive, patient-oriented care across sector boundaries. The structure and the conditions for successful cooperation and comprehensive care are described.
Artifacts in Incomplete Data Tomography with Applications to Photoacoustic Tomography and Sonar
(2015)
We develop a paradigm using microlocal analysis that allows one to characterize the visible and added singularities in a broad range of incomplete data tomography problems. We give precise characterizations for photoacoustic and thermoacoustic tomography and sonar, and provide artifact reduction strategies. In particular, our theorems show that it is better to arrange sonar detectors so that the boundary of the set of detectors does not have corners and is smooth. To illustrate our results, we provide reconstructions from synthetic spherical mean data as well as from experimental photoacoustic data.
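For orientation, incomplete-data problems in photoacoustic tomography and sonar are commonly modeled via the spherical mean transform, which averages the unknown function f over spheres centered at the detector locations. In standard notation (mine, not quoted from the paper):

```latex
% Spherical mean transform: average of f over the sphere of radius r
% centered at a detector position x on the measurement surface S.
(Rf)(x, r) = \frac{1}{\omega_{n-1}} \int_{S^{n-1}} f(x + r\theta)\, d\sigma(\theta),
\qquad x \in S, \; r > 0
```

Here ω_{n−1} is the surface area of the unit sphere S^{n−1}; the data are incomplete when the centers x do not surround the support of f, which is what produces the added artifacts characterized in the paper.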
Tools to predict sound transmission through building elements are beneficial from both a marketing and a research standpoint. Prediction models can assist with the optimization of element performance and often reduce both cost and time in projects. However, the more physical phenomena are to be included in the model, the more input data is needed, which in some cases can be quite time-consuming and costly to collect. A balance is needed between the input and modeling effort and the output, which is usually delivered in one-third octave frequency bands or single-number ratings. Many different types of models are commonly used, including analytical, numerical, and empirical ones, all of which have advantages and disadvantages. In this paper, the latter type is presented using a set of physical properties of lightweight wood elements, and the errors made in the prediction are quantified. The disadvantage of empirical compared to analytical modeling is that the physical phenomena are more difficult to identify. The advantage is often the simplicity and accuracy of the prediction results.
We propose a new algorithmic approach to the non-smooth and non-convex Potts problem (also called piecewise-constant Mumford–Shah problem) for inverse imaging problems. We derive a suitable splitting into specific subproblems that can all be solved efficiently. Our method does not require a priori knowledge on the gray levels nor on the number of segments of the reconstruction. Further, it avoids anisotropic artifacts such as geometric staircasing. We demonstrate the suitability of our method for joint image reconstruction and segmentation. We focus on Radon data, where we in particular consider limited data situations. For instance, our method is able to recover all segments of the Shepp–Logan phantom from seven angular views only. We illustrate the practical applicability on a real positron emission tomography dataset. As further applications, we consider spherical Radon data as well as blurred data.
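For reference, the Potts (piecewise-constant Mumford–Shah) problem addressed here is commonly written as the non-convex minimization below, where ‖∇u‖₀ penalizes the number of jumps of u, A is the forward operator (e.g., the Radon transform) and f the data; the paper's specific splitting of this functional is not reproduced:

```latex
% Potts model: gamma > 0 trades data fidelity against partition complexity.
\min_{u} \; \gamma \, \| \nabla u \|_{0} + \| A u - f \|_{2}^{2}
```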
Analyses on NoSQL databases often run for a long time, and their results are often hard for users to understand. We present an approach that uses the transformation language NotaQL to transform and aggregate data from wide-column stores and to present the results to the user as charts. Sampling techniques are employed to speed up the computation at the expense of accuracy. The iterative sampling procedure we use continuously improves the accuracy of the computation and also provides accuracy estimates, which can be displayed in the charts as confidence intervals.
Since the introduction of the New Public Management model ("Neues Steuerungsmodell") began in Germany in the 1990s, universities have been measured against their performance more than before. To modernize public administration, management techniques from the private sector are being adopted for the strategic governance of universities and other public institutions. Universities are increasingly perceived as service providers and measured more strongly by their actual output. In parallel, international competition between universities has intensified. Both developments require universities to develop effective management structures that safeguard and systematically improve their quality and performance. In doing so, however, factors essential to a university's success, such as creativity, innovation, and diversity, must not be constrained but rather strengthened. Moreover, with their democratic governance bodies and the constitutionally guaranteed freedom of research and teaching, universities have special structures that differ fundamentally from those of private companies. Many aspects of classical management systems and the processes for introducing them therefore cannot be transferred to universities one-to-one. In the DGQ working group 360, "Introducing Quality Management at Universities," experts from various universities pooled their experience and knowledge of how a holistic quality management system can meet the special goals and structures of universities. In this practical handbook, the authors show step by step how such a quality management system can be introduced and shaped and how it can increase a university's quality and performance. A strong focus is placed on process orientation and the use of improvement cycles. The relevant standards and norms are described, and the most important terms are clarified. Particularly valuable are the many examples from more than 20 universities and the discussion of typical pitfalls and key success factors, for example, the great importance of communication and project management when introducing a quality management system. With this book, the working group has produced a guide for practitioners in university boards, administrative units, and faculties who deal with integrated management, the evaluation of research, studies, and teaching, or quality assurance. It also makes a valuable contribution to transferring the idea of quality to the university as a distinctive kind of organization.
We present SQL Island, a novel browser-based learning game based on the concept of text adventures. After a plane crash, the player character lands on an island. The player talks to inhabitants, collects items, and fights villains. The special feature of this game is that the player controls the character solely through SQL statements. All necessary statements are introduced first, so no prior experience is required. After about one hour of play, the player masters SELECT, UPDATE, and DELETE queries as well as grouping, aggregate functions, and joins. The game can be played online at http://www.sql-island.de free of charge and without registration.
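To give a flavor of the statements a player learns (the game's actual schema is not reproduced here; the table and column names below are invented), here are analogous queries run against an in-memory SQLite database from Python:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE inhabitant (id INTEGER, name TEXT, job TEXT, gold INTEGER)")
con.executemany("INSERT INTO inhabitant VALUES (?, ?, ?, ?)",
                [(1, "Fisher", "fisherman", 35),
                 (2, "Smith", "blacksmith", 90),
                 (3, "Snake", "villain", 120)])

# SELECT with a condition, as taught early in the game.
villains = con.execute("SELECT name FROM inhabitant WHERE job = 'villain'").fetchall()

# UPDATE and DELETE, introduced later.
con.execute("UPDATE inhabitant SET gold = gold + 10 WHERE name = 'Fisher'")
con.execute("DELETE FROM inhabitant WHERE job = 'villain'")

# Grouping with an aggregate function.
avg = con.execute("SELECT job, AVG(gold) FROM inhabitant GROUP BY job").fetchall()
print(villains, avg)
```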
Nonlinear ill-posed problem analysis in model-based parameter estimation and experimental design
(2015)
Discrete ill-posed problems are often encountered in engineering applications. Still, their sound analysis is not yet common practice and difficulties arising in the determination of uncertain parameters are typically not assigned properly. This contribution provides a tutorial review on methods for identifiability analysis, regularization techniques and optimal experimental design. A guideline for the analysis and classification of nonlinear ill-posed problems to detect practical identifiability problems is given. Techniques for the regularization of experimental design problems resulting from ill-posed parameter estimations are discussed. Applications are presented for three different case studies of increasing complexity.
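As a small, self-contained illustration of the kind of difficulty discussed (not one of the paper's case studies), the following shows a practically non-identifiable linear parameter estimation stabilized by Tikhonov regularization:

```python
import numpy as np

rng = np.random.default_rng(1)
# Nearly collinear design matrix -> practically non-identifiable parameters.
t = np.linspace(0, 1, 50)
A = np.column_stack([t, t + 1e-4 * rng.standard_normal(50)])
theta_true = np.array([1.0, 2.0])
y = A @ theta_true + 0.01 * rng.standard_normal(50)

# Ordinary least squares: huge variance on the two correlated parameters.
theta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# Tikhonov regularization: solve (A^T A + alpha I) theta = A^T y.
alpha = 1e-3
theta_tik = np.linalg.solve(A.T @ A + alpha * np.eye(2), A.T @ y)

print("condition number:", np.linalg.cond(A))
print("OLS:", theta_ls, " Tikhonov:", theta_tik)
```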
In-plant milk-run systems are transportation systems, where materials are delivered from a central storage area to several points of use on defined routes and in short intervals. Milk-run systems generally enable frequent deliveries in smaller lot sizes with short lead times and low inventory at the points of use. In real milk-run systems, the number of deliveries per interval varies due to, e.g., variations in the production program. To enable efficient and stable milk-run system operations, these systems therefore must be able to cope with peaks in the number of deliveries. We develop different strategies for handling these delivery peaks and evaluate them with respect to delivery cost, lead time and service level using real material consumption data from two large companies from the automotive industry.
In this report, the delivery of polyamide 12 (PA 12) powder and powder layer preparation by vibrating steel nozzles is investigated and discussed with respect to its application for laser beam melting. To this end, a setup was realized which includes a steel nozzle attached to a piezo actuator as well as a positioning system. In order to investigate the mass flow characteristics in dependency on the applied vibration state, a weighing cell is used, enabling time-resolved mass flow measurements. Moreover, single-layer patterns consisting of colored and uncolored polyamide 12 were created and characterized regarding surface homogeneity and selectivity before as well as after the melting of the powder layers by a hot plate.
By Laser Beam Melting of polymers (LBM), parts with almost any geometry can be built directly from CAD files without the need for additional tools. Thus, prototypes or parts in small-series production can be generated within short times. Up to now, no multi-material parts have been built by LBM, which is a major limitation of the technology. To realize multi-material parts, new mechanisms for depositing different polymer powders as well as a new irradiation strategy are needed, by which polymers with different melting temperatures can be warmed to their specific preheating temperatures and melted simultaneously. This is achieved by simultaneous laser beam melting (SLBM). In the process, two different materials are deposited next to each other and preheated a few degrees below their melting temperatures by infrared emitters and laser radiation (λ = 10.60 µm), before in the last step the two preheated powders are melted simultaneously by an additional laser (λ = 1.94 µm). So far, multi-material tensile bars have been realized and analyzed regarding the boundary zone between both materials. The experiments showed that the temperature gradients in the boundary zone and along the building direction seem to be of great importance for process stability and the resulting part properties. Therefore, a detailed analysis of the temperature gradients occurring during the process is needed to identify adequate process adjustments regarding temperature control. To analyze the temperature gradients, thermocouples positioned inside the powder bed are used. By varying the temperature of the building platform, the influence of different temperature gradients on the resulting part properties is shown.
Robots should appropriately give reasons for their actions when these actions affect a human’s action or goal space. Communicating reasons may help the human understand the robot’s intents and may initiate joint action, i.e., accepting the robot’s goals and cooperating on the robot’s actions. However, to be efficient, the communication of reasons should be limited to the necessary rather than aim at completeness, conforming to the Gricean Maxim of Quantity. Furthermore, what is necessary only becomes apparent as the situation evolves; hence, for seamless interaction, ongoing utterances must be adapted as they happen. We present a system that flexibly gives reasons in a reduced setting in which the robot needs to intrude on a human’s personal space in order to reach its goal.
We propose to use a model of personal space to initiate communication while passing a human thereby acknowledging that humans are not just a special kind of obstacle to be avoided but potential interaction partners. As a simple form of interaction, our system communicates an apology while closely passing a human. To this end, we present a software architecture that integrates a social-spaces knowledge base and a component for incremental speech production. Incrementality ensures that the robot’s utterance can be adapted to fit the developing situation in a natural way. Observer ratings show that personal-space intrusion is perceived as both natural and polite if the robot has the capability to utter and adapt an apology in an incremental way whereas it is perceived as unfriendly if the robot intrudes personal space without saying anything. Moreover, the robot is perceived as less natural if it does not adapt.
Incremental speech synthesis aims at delivering the synthetic voice while the sentence is still being typed. One of the main challenges is the online estimation of the target prosody from a partial knowledge of the sentence's syntactic structure. In the context of HMM-based speech synthesis, this typically results in missing segmental and suprasegmental features, which describe the linguistic context of each phoneme. This study describes a voice training procedure which integrates explicitly a potential uncertainty on some contextual features. The proposed technique is compared to a baseline approach (previously published), which consists in substituting a missing contextual feature by a default value calculated on the training set. Both techniques were implemented in a HMM-based Text-To-Speech system for French, and compared using objective and perceptual measurements. Experimental results show that the proposed strategy outperforms the baseline technique for this language.
This contribution presents melt emulsification as a process for producing polymer microparticles. In this process, polymer granulate is first melted in a continuous phase in the presence of suitable additives in a stirred vessel; the raw emulsion is then finely emulsified in a rotor-stator unit and subsequently cooled to form a suspension. The influence of process parameters and system composition on the emulsification result is discussed, and the applicability of the process to polymer microparticles is demonstrated for polypropylene (PP) and high-density polyethylene (PE-HD). The resulting suspensions are spray-dried to obtain powders, and the flow properties of the powders are analyzed. Dry coating with fumed silica further improves the flowability of the obtained particles. The process thus offers a new route to producing new feedstock materials for additive manufacturing.
By simultaneous laser beam melting (SLBM), parts consisting of different polymer powders can be additively manufactured within one building process. Besides the advantages of conventional LBM, e.g., not needing additional tools and being able to realize parts with almost any geometry, different product requirements can be achieved within a single part, such as different chemical resistances or haptic material properties. SLBM therefore enlarges the application field of additive manufacturing in general. In the process, two different materials are deposited on the building platform and preheated a few degrees below the melting temperature of the lower-melting polymer by infrared emitters. Afterward, a CO2 laser (λ = 10.6 μm) provides the energy for the temperature difference between the preheating temperatures of both materials. Finally, a digital light processing chip is used to achieve simultaneous and flexible energy deposition for melting both preheated polymers. By illuminating the chip with a laser, parts of the beam can be flexibly guided onto the powder bed or into a beam trap. A single-mode thulium laser (λ = 1.94 μm) is used for this purpose. After melting the layer, a new layer is deposited and the process starts anew. In this paper, polypropylene and polyamide 12 are used as materials. After analyzing the material and melting behavior during the process with a high-resolution thermal imaging system, the parts are qualified regarding their material compatibility at the boundary zone and their porosity by cross sections.
In this paper, first results regarding the realization of multi-material parts by Simultaneous Laser Beam Melting (SLBM) of polymers are presented. This new approach allows the layerwise generation of parts consisting of different polymer materials within one building process. Besides the typical advantages of additive manufacturing technologies, such parts can fulfill different product requirements concomitantly and could therefore enlarge the overall field of application. The powder materials used for this paper are polyethylene (PE) and a polyamide-based thermoplastic elastomer (TPE). After the powder materials are deposited next to each other, infrared emitters heat the lower-melting polymer and a CO2 laser provides the preheating temperature of the higher-melting polymer. In the last step, a thulium fibre laser melts the two preheated powders simultaneously. The realized specimens are characterized by cross sections and their tensile strengths are determined. Additionally, the new approach of simultaneous energy irradiation is investigated using a Finite Element Analysis in order to gain a more profound process understanding. In that sense, the influence of the size of the exposure area on the reachable maximum temperatures inside that area was analyzed by the simulation and compared to experimental studies.
Purpose – The purpose of this paper is to demonstrate the processability of cohesive PE-HD particles in laser beam melting (LBM) processes of polymers. Furthermore, we present a characterization method for polymer particles which can predict the quality of the powder deposition in LBM processes. Design/methodology/approach – This study focuses on the application of dry particle coating processes to increase the flowability and bulk density of PE-HD particles. Both have been measured and afterwards validated via powder deposition of PE-HD particles in an LBM machine. Findings – For efficient coating in a dry particle coating process, the PE-HD particles and the attached nanoparticles need to show similar surface chemistry, i.e., both need to behave either hydrophobically or hydrophilically. It is demonstrated that dry particle coating is appropriate to enhance the flowability and bulk density of PE-HD particles and hence considerably improves LBM processes and the resulting product quality. Originality/value – At present, mainly polyamide 12 (PA 12) particles are used in LBM processes, which are so far quite expensive in comparison to, for example, PE-HD particles. This work provides a unique and versatile method for nanoparticulate surface modification which may be applied to a wide variety of materials. After the coating, the particles are applicable to the LBM process. Our results provide a correlation between flowability and bulk density and the resulting product quality.
A method, system, and computer-usable non-transitory storage device for dynamic voice codec adaptation are disclosed. The voice codec adapts in real time to devote more bits to audio quality when it is most needed, and fewer bits to less important parts of utterances. Dialog knowledge is utilized to find compression opportunities and to adjust the bitrate moment by moment, based on the inferred value of each frame. Frame importance and the appropriate transmission fidelity are predicted based on prosodic features and models of dialog dynamics. This technique provides the same communication quality with less spectrum, fewer antennas, and less battery drain.
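A toy sketch of the core idea of spending bits where they matter: a per-frame importance score, which in the disclosed technique would be predicted from prosodic features and models of dialog dynamics, is mapped to a bitrate tier. The scores, tiers and thresholds below are invented for illustration.

```python
def choose_bitrate(importance: float) -> int:
    """Map a predicted frame-importance score in [0, 1] to a codec
    bitrate in bits/s. Tiers and thresholds are illustrative only."""
    if importance > 0.8:
        return 24000   # critical content: highest fidelity
    if importance > 0.4:
        return 12000   # ordinary speech
    return 6000        # silence, fillers, backchannels

# Frame-by-frame adaptation over a made-up importance trajectory.
frame_importance = [0.1, 0.2, 0.9, 0.85, 0.5, 0.1]
bitrates = [choose_bitrate(s) for s in frame_importance]
print(bitrates)   # [6000, 6000, 24000, 24000, 12000, 6000]
```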
The Impact of the ObamaCare Excise Tax on Innovation and Entrepreneurship – Early Empirical Findings
(2015)
This study addresses aspects of governmental influence on innovation by analyzing the impact of the ObamaCare excise tax on the medical device industry. We initially give an overview of common approaches to measuring innovativeness and entrepreneurship, empirically assess whether existing metrics are suitable for investigating the innovation performance of the U.S. medical device industry, and define a new measure (firm innovation activity) for entrepreneurship. Then we perform a quantitative analysis to explore the impact of the tax. We analyze more than 60,000 product clearances from 1996 to 2013, using the FDA database. We find a significant relationship between product counts and revenues for one segment. Contrary to the present criticism of the excise tax, we find hardly any noteworthy response in either firm innovation activity or number of products launched in the year after the tax was introduced. The 2013 reduction of new product submissions is well within the limits of typical annual fluctuations observed in previous years. This provides a first indication that the excise tax act did not have a strong impact on innovative activities through the present.
Innovation diffusion describes how innovations spread into the market after launch. This paper investigates diffusion dynamics at market entry time and proposes a new evolution pattern at the intersection between inventions and innovations. With this in mind, we initially prove that patent filings correlate with new product introductions in the U.S. spine market. Then we test our new theory, which supposes that certain patent-filing threshold numbers accelerate strong economic returns in terms of innovations. We find that firms hitting certain patent-filing thresholds significantly increase their product launches in the mentioned market. Moreover, the results seem to indicate that the economic returns of inventions can be measured substantially. This paper thus suggests a new research area by introducing the proposed concept of an Innovation Outcome Trigger Value (IOTV). The implications may also be interesting for practitioners, since we empirically show that inventive activities are indeed worthwhile.
Methods are considered for the indirect determination of the mobility of structure-borne sound sources. Instead of performing measurements on the source in the free state, the source mobility is obtained from measurements made in-situ. This approach is beneficial if the source is difficult to suspend, or if it contains nonlinear structural elements. Two formulations for an indirect source mobility are derived theoretically. The first requires measurement of velocities at or near to the contact points. The second involves measurement of remote velocities only. Neither of the methods requires excitation at the contacts in the coupled state. Numerical simulations of coupled beams are used to validate the two methods and investigate their accuracy and reliability with respect to typical measurement errors, such as background noise and inaccuracies in sensor positioning. It is found that these can have a significant effect on the methods considered. Several experimental case studies with single-contact and multi-contact sources are performed. The results confirm the validity of the two methods in principle, but highlight their sensitivity to experimental errors. In a representative case study with a fan unit, average errors range between ±5 dB and ±10 dB, with occasional errors of up to 30 dB.
We present a novel technique for rendering depth of field that addresses difficult overlap cases, such as close, but out-of-focus, geometry in the near-field. Such scene configurations are not managed well by state-of-the-art post-processing approaches since essential information is missing due to occlusion. Our proposed algorithm renders the scene from a single camera position and computes a layered image using a single pass by constructing per-pixel lists. These lists can be filtered progressively to generate differently blurred representations of the scene. We show how this structure can be exploited to generate depth of field in real-time, even in complicated scene constellations.
Rendering performance is an everlasting goal of computer graphics and significant driver for advances in both, hardware architecture and algorithms. Thereby, it has become possible to apply advanced computer graphics technology even in low-cost embedded appliances, such as car instruments. Yet, to come up with an efficient implementation, developers have to put enormous efforts into hardware/problem-specific tailoring, fine-tuning, and domain exploration, which requires profound expert knowledge. If a good solution has been found, there is a high probability that it does not work as well with other architectures or even the next hardware generation. Generative DSL-based approaches could mitigate these efforts and provide for an efficient exploration of algorithmic variants and hardware-specific tuning ideas. However, in vertically organized industries, such as automotive, suppliers are reluctant to introduce these techniques as they fear loss of control, high introduction costs, and additional constraints imposed by the OEM with respect to software and tool-chain certification. Moreover, suppliers do not want to share their generic solutions with the OEM, but only concrete instances. To this end, we propose a light-weight and incremental approach for meta programming of graphics applications. Our approach relies on an existing formulation of C-like languages that is amenable to meta programming, which we extend to become a lightweight language to combine algorithmic features. Our method provides a concise notation for meta programs and generates easily sharable output in the appropriate C-style target language.
With ever increasing ray traversal and hierarchy construction performance the application of ray tracing to problems often tackled by rasterization-based algorithms is becoming a viable alternative. This is especially desirable as the ground truth for these algorithms is often determined by using ray tracing and thus directly applying it is the simplest way to generate images satisfying the reference. In this paper we propose a very efficient pre-process to speed up the construction and traversal of sub-optimal, but fast-to-build hierarchies used for interactive ray tracing and show how it can be applied to shadow rays in a hybrid environment, where ray tracing is used to sample area lights for scene positions found and shaded via rasterization.
A common way to ray trace subdivision surfaces is by constructing and traversing spatial hierarchies on top of tessellated input primitives. Unfortunately, tessellating surfaces requires a substantial amount of memory storage, and involves significant construction and memory I/O costs. In this paper, we propose a lazy-build caching scheme to efficiently handle these problems while also exploiting the capabilities of today's many-core architectures. To this end, we lazily tessellate patches only when necessary, and utilize adaptive subdivision to efficiently evaluate the underlying surface representation. The core idea of our approach is a shared lazy evaluation cache, which triggers and maintains the surface tessellation. We combine our caching scheme with SIMD-optimized subdivision primitive evaluation and fast hierarchy construction over the tessellated surface. This allows us to achieve high ray tracing performance in complex scenes, outperforming the state of the art while requiring only a fraction of the memory. In addition, our method stays within a fixed memory budget regardless of the tessellation level, which is essential for many applications such as movie production rendering. Beyond the results of this paper, we have integrated our method into Embree, an open source ray tracing framework, thus making interactive ray tracing of subdivision surfaces publicly available.
In this paper, we introduce a novel technique for pre-filtering multi-layer shadow maps. The occluders in the scene are stored as variable-length lists of fragments for each texel. We show how this representation can be filtered by progressively merging these lists. In contrast to previous pre-filtering techniques, our method better captures the distribution of depth values, resulting in a much higher shadow quality for overlapping occluders and occluders with different depths. The pre-filtered maps are generated and evaluated directly on the GPU, and provide efficient queries for shadow tests with arbitrary filter sizes. Accurate soft shadows are rendered in real-time even for complex scenes and difficult setups. Our results demonstrate that our pre-filtered maps are general and particularly scalable.
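A CPU-side sketch of the central list-merging operation, under assumptions of mine rather than the paper's exact scheme: each texel holds a depth-sorted, variable-length list of occluder fragments with opacities, pre-filtering merges neighboring lists under a box filter and truncates them to a fixed budget, and a shadow test accumulates the opacity in front of the receiver.

```python
from heapq import merge

# A fragment list per texel: (depth, opacity) pairs sorted by depth.
TexelList = list[tuple[float, float]]

def merge_lists(a: TexelList, b: TexelList, max_len: int = 8) -> TexelList:
    """Merge two depth-sorted fragment lists, weighting each source by
    1/2 (box filter), and truncate to max_len fragments."""
    merged = [(d, 0.5 * o) for d, o in merge(a, b)]
    if len(merged) > max_len:
        # Naive truncation: drop the least opaque fragments over budget.
        merged = sorted(merged, key=lambda f: -f[1])[:max_len]
        merged.sort(key=lambda f: f[0])
    return merged

def shadow_test(fragments: TexelList, receiver_depth: float) -> float:
    """Fraction of light blocked: accumulate the opacity of all fragments
    in front of the receiver (a crude visibility model)."""
    blocked = 0.0
    for depth, opacity in fragments:
        if depth < receiver_depth:
            blocked += (1.0 - blocked) * opacity
    return blocked
```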
The new German S3 guidelines "Screening, Diagnosis and Treatment of Alcohol-Related Disorders" (AWMF 076001) contain a number of references to gender-specific differences regarding screening, diagnostics, and above all the treatment of alcohol-related disorders. One subchapter summarizes the results of a broad literature search for (randomized) controlled trials with women as study subjects. Brief interventions are recommended for the treatment of pregnant women with mild to moderate alcohol-related disorders. For pregnant women with severe alcohol-related disorders, inpatient treatment should be sought, or support through pregnancy and the postpartum period via home visits by a professional. Brief interventions have also proven effective for non-pregnant women with mild to moderate alcohol-related disorders. Women diagnosed with both post-traumatic stress disorder and a substance use disorder should receive integrated treatment (such as the "Sicherheit finden" program). Cognitive couples therapy and outpatient or inpatient settings with women's groups have proven effective to some extent, especially for women who prefer couples treatment or treatment in a women's group. More and better research is urgently needed to improve services for women with substance use disorders.
Verbesserung der Praxistauglichkeit der Baunormen durch pränormative Arbeit - Teilantrag 2: Betonbau
(2015)
"Wie sehen Sie das?"
(2015)
Ist Inklusion messbar?
(2015)
For organizing and systematizing scientific research findings and for answering clinical questions with regard to decisions in everyday care practice, the research pyramid offers a model for evaluating and integrating external evidence from different research approaches. This contribution provides an update on recent developments and the current state of the model as well as on the fundamentals of a pyramid review.
Health care cannot do without patients' personal responsibility, self-help, and active participation. Which mechanisms operate in self-help groups has not yet been scientifically examined in detail. However, in connection with established knowledge about the salutogenic and tertiary-preventive effects of communication and social bonds, a general heuristic model of the effects of self-help groups can be derived from the current state of self-help research; this model is briefly outlined.
Alcohol-related diseases have a particular socioeconomic significance in Germany. People with alcohol dependence are supported by multimodal therapy concepts (outpatient, day-care, or inpatient interventions) and above all by the diverse offerings of addiction self-help.
Self-help groups in particular contribute to mutual social support, to the acquisition of information, and to attitude changes among those affected and in their social environment. They relieve the primary networks of those affected and support a more targeted use of professional services. Even though the results and conclusions of studies on the effectiveness of self-help for alcohol dependence are limited by the methodological difficulties of effectiveness research, most researchers agree that support from self-help groups, alone and in combination with professional care, helps people achieve and maintain abstinence from alcohol.
Self-help activities are increasingly funded by the social insurance system and integrated into the care system. The imbalance in the cooperation between self-help and the professional system can be offset by getting to know the regional self-help landscape with its structures, support services, and particularities.
Through its scientifically grounded account of (addiction) self-help, this contribution reduces possible uncertainties in dealing with self-help groups in everyday practice.
This volume provides an overview of the state of quantitative research on physiotherapeutic interventions for low back pain. The evidence from observational and experimental studies on effectiveness with respect to employment is translated into concrete treatment recommendations. The development and application of the system for the equal-weight appraisal of studies from both approaches, developed for this purpose on the basis of the research pyramid, is illustrated.
In order to come to evidence-based clinical decisions, speech-language therapists need to draw on specific knowledge, skills and professional attitudes. The TÜBEP-ST 1.0 introduced here is an assessment tool focusing on the evaluation of basic competencies in evidence-based decision making for German speech-language therapy students. The test makes it possible to determine students' level of proficiency concerning fundamental concepts, definitions and practical skills in evidence-based practice. The TÜBEP-ST is derived from the "FRESNO Test of Competence in Evidence-Based Practice", a well-established, validated instrument for the basic steps of evidence-based medicine.
The characterization of the oxygen diffusion zone in titanium and the effect of this zone on macroscopic properties remain of high interest as a basis for predicting and enhancing the lifetime of titanium and titanium-alloy components. The aim of this study was to contribute to the understanding of the impact of oxygen on the fatigue properties of oxygen-diffusion-hardened Ti and Ti alloys. Oxygen diffusion hardening involves two process steps: first the oxidation of the surface and second the diffusion of oxygen into the metal matrix. Owing to the one-step treatment used in this study, the oxidation step could take place easily while avoiding scaling and grain-boundary diffusion. In spite of this precaution, the fatigue properties in the present study were found to be decreased after the oxygen diffusion hardening performed. The reduction of mechanical properties was attributed to oxide clusters on the surface acting as crack-initiation sites. Comparison and discussion with the literature revealed varying, partly contradictory fatigue results. Precise analysis of fatigue failure is therefore necessary as a basis for the further development of oxygen diffusion hardening.