The article is a follow-up and an extension to the papers previously published by Holzer-Żelażewska & Holzer (1997) and Holzer-Żelażewska & Tymicki (2009). Firstly, we have added new cohorts to the cohort analysis, based on individual data from birth registration for the years 2009–2015. Secondly, we have extended the scope of the study by analysing the cohort fertility of Polish women in the context of postponement and recuperation.
The approach applied to fertility postponement and recuperation on cohort data follows the method originally proposed by Lesthaeghe (2001) and Frejka (2011) and further developed by Sobotka et al. (2011). This method allows the calculation of fertility postponement and recuperation measures with respect to a benchmark cohort, chosen as the first cohort to experience the onset of the increase in the mean age of mothers at first birth.
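As a rough numerical illustration of this benchmark comparison, the sketch below compares a cohort's cumulative fertility schedule to that of a benchmark cohort, measuring the deepest cumulative deficit (postponement) and the share of it recovered at the end of the reproductive span (recuperation). The schedules are hypothetical illustrative numbers, not Polish data or the authors' calculations.

```python
# Hedged sketch in the spirit of the Frejka (2011) / Sobotka et al. (2011)
# decomposition; all input numbers below are invented for illustration.

def postponement_recuperation(benchmark, cohort):
    """Compare a cohort's cumulative fertility (by age) to a benchmark.

    Returns the maximum deficit (postponement), the final gap, and the
    recuperation index RI = share of the deficit recovered later.
    """
    gaps = [c - b for b, c in zip(benchmark, cohort)]
    trough = min(gaps)              # largest cumulative deficit vs benchmark
    final_gap = gaps[-1]            # permanent shortfall (if any)
    recovered = final_gap - trough  # fertility made up after the trough
    ri = recovered / -trough if trough < 0 else float("nan")
    return trough, final_gap, ri

# Illustrative cumulative first-birth rates at ages 20, 25, 30, 35, 40:
benchmark = [0.30, 0.65, 0.80, 0.85, 0.87]
cohort    = [0.15, 0.40, 0.65, 0.80, 0.85]   # postponed but recuperating

trough, final_gap, ri = postponement_recuperation(benchmark, cohort)
print(trough, final_gap, round(ri, 2))
```

With these toy schedules the cohort falls 0.25 births per woman behind the benchmark at the trough but recovers most of the deficit, giving a recuperation index of about 0.92.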
The results show remarkable changes in fertility patterns in Poland. The main driving forces behind this change are the postponement of first births combined with relatively good recuperation. The magnitude of recuperation for Polish cohorts dropped significantly for second births and was almost non-existent for third and higher-order births. The fertility pattern observed in Poland until 2015 can therefore be characterized by the postponement and recuperation of first births, a significant decrease in second births, and the perpetual postponement of third and higher-order births.
The main objective of this work is to present a successful stabilization of a building structure located on an active landslide. Firstly, the history of the case and an FEM simulation explaining the ensuing situation are presented. Then, different structural measures to stabilize the whole system are discussed. The structural solution of the problem (a pile system reaching the solid rocky zone) is presented in more detail. Finally, the estimation of the forces exerted on the structure by the unstable soil mass, which is crucial for the design of the stabilizing structure, is described.
Performance measurement systems in supply chain management (SCM) have been receiving increasing attention from business organizations as a way to evaluate the efficiency of supply chain activities. Assessing supply chain performance uncovers the gap between planned and actual performance, traces potential problems and thus identifies areas for improvement. This research aims to investigate the application of performance measurement systems in SCM and to explore their relationship with organizational performance among Malaysian manufacturing firms. Using a questionnaire, respondents were asked to indicate the extent to which they use 24 selected performance measures related to SCM. The results show that the majority of the observed manufacturing firms use specific performance measurement tools to evaluate supply chain performance. Among current performance measurement techniques, the Balanced Scorecard is adopted by around a quarter of the responding firms, followed by the Supply Chain Operations Reference (SCOR) model, used by only a fifth of the respondents. In particular, performance measures in the customer service category recorded the highest usage, followed by cost-based measures and operations management measures. The investigation also reveals several points worth highlighting. Firstly, the results bring to light significant relationships between the use of supply chain performance measures in the customer service and operations management categories and organizational performance. In addition, the study found a significant positive correlation between organization size and the extent of use of supply chain performance measures. Lastly, the findings suggest that performance measures for SCM play a crucial role in enhancing organizational performance and are used increasingly as firms grow in size. On this basis, the research contributes new knowledge on performance measurement systems in SCM and their associations with organizational performance.
To reliably calibrate suitable partial safety factors, useful for specifying the global condition describing the structural safety level in the considered design case, an evaluation of the adequate failure probability is usually necessary. In an accidental fire situation, not only the probability of collapse of the load-bearing structure but also another probability, related to the people staying in the building at the moment of fire occurrence, should be assessed. These values differ qualitatively but are coupled, because they are determined by similar factors. The first is a conditional probability, conditioned on a fire having already occurred, whereas the second is the probability of failure in the case of a potential fire that can take place in the examined building compartment but whose ignition has not yet occurred. An engineering approach to estimating both probabilities is presented and discussed in detail in the article.
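The coupling between the two probabilities can be sketched in one line: the unconditional probability of fire-induced failure is the product of the ignition probability and the conditional collapse probability. The numbers below are purely hypothetical, not values from the article.

```python
# Illustrative sketch of the two coupled probabilities discussed above
# (both input values are invented for illustration).

p_fire = 1e-2             # probability of fire ignition in the compartment
p_fail_given_fire = 1e-3  # conditional probability of collapse given fire

# Unconditional probability of fire-induced structural failure:
p_fail = p_fire * p_fail_given_fire
print(p_fail)
```

This makes explicit why the two quantities are "different in a qualitative sense but coupled": the conditional value describes the structure under fire, while the unconditional one also reflects how likely ignition is at all.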
Large movements of asset prices are intrinsic elements of financial crises, and it is commonly agreed that extreme events are responsible for them. To make inferences about risk spillover and its effect on markets, one should use methods and tools suited to catastrophic events. In this paper, Extreme Value Theory (EVT), developed specifically for modelling extreme events, is used. The purpose of the paper is to model risky assets using EVT and to analyse the transfer of risk across financial markets all over the world using the Granger causality in risk test. The original idea of Granger causality in risk assumes the use of Value at Risk as the risk measure; we extend the scope of the test to Expected Shortfall and the Spectral Risk Measure, i.e., the respective hypotheses are constructed and checked by simulation. Attention is concentrated on Chinese financial processes and their relations with those in the rest of the world. The empirical results exhibit very interesting dependencies.
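The three risk measures named above can be illustrated with minimal empirical estimators on a toy loss sample. This is a sketch of the textbook definitions only, assuming losses recorded as positive numbers; it is not the estimation procedure used in the paper.

```python
# Hedged sketch of empirical VaR, Expected Shortfall and a Spectral Risk
# Measure on an illustrative loss sample (not data from the paper).

def var(losses, alpha):
    """Empirical Value at Risk at level alpha (losses as positive numbers)."""
    s = sorted(losses)
    k = int(alpha * len(s))            # index of the alpha-quantile
    return s[min(k, len(s) - 1)]

def expected_shortfall(losses, alpha):
    """Average of the losses at or beyond the alpha-quantile."""
    q = var(losses, alpha)
    tail = [x for x in losses if x >= q]
    return sum(tail) / len(tail)

def spectral_risk(losses, weights):
    """Weighted average of sorted losses; `weights` must be
    non-decreasing and sum to 1 (the risk-aversion spectrum)."""
    s = sorted(losses)
    return sum(w * x for w, x in zip(weights, s))

losses = [0.5, 1.0, 1.5, 2.0, 8.0]
print(var(losses, 0.8))                                # 8.0
print(expected_shortfall(losses, 0.8))                 # 8.0
print(spectral_risk(losses, [0, 0, 0.1, 0.3, 0.6]))    # ~5.55
```

The spectral measure generalizes the other two: a weight vector concentrated entirely on the worst observations reproduces Expected Shortfall, which is what makes extending the causality-in-risk test from VaR to these measures natural.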
Independent Component Analysis (ICA) can be used for single-channel audio separation if the mixed signal is transformed into the time-frequency domain and the resulting matrix of magnitude coefficients is processed by ICA. Previous works used only frequency (spectral) vectors and the Kullback-Leibler distance measure for this task. New decomposition bases are proposed here: time vectors and time-frequency components. The applicability of several different measures of distance between components is analysed, and an algorithm for clustering components is presented. It was tested on mixtures of two and three sounds. The perceptual quality of separation obtained with the proposed distance measures was evaluated in listening tests, indicating the "beta" and "correlation" measures as the most appropriate. The "Euclidean" distance is shown to be appropriate for sounds with varying amplitudes. The perceptual effect of the amount of variance retained was also evaluated.
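To make the clustering step concrete, the sketch below implements one plausible reading of the "correlation" distance: one minus the Pearson correlation between the time envelopes of two separated components, so components whose envelopes rise and fall together get a small distance and end up in the same cluster. The envelopes are toy vectors; in the actual algorithm they would come from ICA on the magnitude spectrogram, and the exact distance definitions are those of the paper, not this sketch.

```python
import math

# Hedged sketch of a "correlation"-type distance between the time
# envelopes of two separated components (toy vectors, hypothetical
# definition; the paper's own measures may differ in detail).

def correlation_distance(a, b):
    """1 - Pearson correlation: ~0 for identical shapes, up to 2."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(x * x for x in db))
    return 1.0 - num / den

env1 = [0.0, 1.0, 2.0, 1.0, 0.0]   # two components with the same envelope
env2 = [0.0, 2.0, 4.0, 2.0, 0.0]   # (a scaled copy) should cluster together
env3 = [2.0, 1.0, 0.0, 1.0, 2.0]   # an unrelated, anti-correlated envelope

print(correlation_distance(env1, env2))   # ~0.0 -> same cluster
print(correlation_distance(env1, env3))   # ~2.0 -> different cluster
```

Because the measure is amplitude-invariant, a scaled copy of an envelope has distance near zero, which is exactly the property a clustering step needs when ICA returns components with arbitrary gains.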
In the age of Information and Communication Technology (ICT), the Web and the Internet have significantly changed the way applications are developed, deployed and used. One recent trend is the design of web applications based on SOA, a process built on composing existing web services into a single scenario from the point of view of a particular user or client. This allows IT companies to shorten time to market. On the other hand, it raises questions about application quality, about trade-offs between quality factors and attributes, and about how to measure them. Services are usually hosted and executed in an environment managed by their provider, which assures quality attributes such as availability or throughput. In this paper, therefore, an attempt has been made to perform quality measurements aimed at the creation of efficient, dependable and user-oriented Web applications. First, the process of designing service-based applications is described. Next, metrics for subsequent measurements of the efficiency, dependability and usability of distributed applications are presented; these metrics assess the efforts and trade-offs in Web-based application development. As examples, we describe two multimedia applications developed in our department and executed in a cluster-based environment: one runs in the BeesyCluster middleware and the other in the Kaskada platform. For these applications we present measurement results and draw conclusions about the relations between quality attributes in the presented application development model. This knowledge can be used to reason about such relations for new, similar applications and to support their rapid, high-quality development.
Redundancy-based methods are proactive scheduling methods for solving the Project Scheduling Problem (PSP) with non-deterministic activity durations. The fundamental strategy of these methods is to estimate activity durations by adding extra time to the original durations. This extra time accounts for the risks that may affect activity durations and reduces the number of adjustments to the baseline generated for the project. In this article, four redundancy-based methods are proposed and compared using two robustness indicators, calculated after running a simulation process. Linear programming is applied as the solution technique to generate the baselines of the 480 projects analysed. The results make it possible to identify the most adequate method for solving the PSP with probabilistic activity durations and generating robust baselines.
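The redundancy-plus-simulation idea can be sketched in a few lines: inflate each planned duration by a buffer, then estimate a robustness indicator by simulation as the share of random scenarios whose makespan stays within the buffered baseline. The serial project, buffer factor and uniform duration noise below are all illustrative assumptions, not the four methods or the indicators of the article.

```python
import random

# Toy sketch of a redundancy-based baseline with a simulated robustness
# indicator (hypothetical single-chain project; not the article's methods).

random.seed(42)

durations = [4, 6, 3, 5]          # planned durations of a serial activity chain
buffer_factor = 0.25              # 25 % redundancy added to each activity
baseline = sum(d * (1 + buffer_factor) for d in durations)

def simulate_makespan(durations, spread=0.5):
    """One scenario: each realized duration varies uniformly around the plan."""
    return sum(d * random.uniform(1 - spread, 1 + spread) for d in durations)

n = 10_000
on_time = sum(simulate_makespan(durations) <= baseline for _ in range(n))
robustness = on_time / n          # fraction of scenarios within the baseline
print(round(robustness, 2))
```

Larger buffers raise the indicator but lengthen the promised baseline, which is the trade-off the robustness indicators are meant to quantify when comparing redundancy methods.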
Units of measurement appear as media of social conflict in Witold Kula’s seminal study on metrication. Given the current discussions around political epistemology, Kula’s treatment of metrology is telling. He turns the supposedly neutral auxiliary science of weights and measures into a matter of concern. The reception of his concepts in the West is outlined (history of historical metrology, the Annales school, and the history of science), and the potential of this social history of measurement in times of accelerated data production is evaluated.
The paper presents a method of measuring the deformations of cylindrical samples on a testing machine for free tube hydroforming experiments. During the experiments, a sample made of a thin-walled metal tube is expanded by the internal pressure of the working liquid and additionally subjected to axial compression. This results in considerable circumferential deformation of the tube and its shortening. Analysis of the load cases and their impact on the deformations can be helpful in determining, for example, tube material properties or general limiting conditions in the tube hydroforming process. Consequently, determining the deformation values and tracking their course during the experiment has become one of the most important problems in this context.
Robotic total stations are a group of surveying instruments that can be used to measure moving prisms. These devices can generate significant errors during kinematic surveys. This is due to the different speeds of the total station’s measurement subsystems, which means that the observations determining the point location are performed in different places in space. Total stations that are several years old may generate errors of up to a few dozen centimetres. More modern designs, with much lower delays in the mechanical and electronic subsystems, should theoretically reduce these errors significantly. This study involved kinematic tests of the modern robotic total station Leica MS50 in order to determine the values of the measurement errors and to assess the instrument’s suitability for such kinematic applications.
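The scale of the error source described above follows from a simple back-of-the-envelope relation: if the angle and distance subsystems observe the moving prism with a relative delay, the two observations refer to positions roughly the prism speed times the delay apart. The values below are illustrative assumptions, not Leica MS50 specifications or results from the study.

```python
# Hedged sketch of the subsystem-delay error for a moving prism
# (illustrative values; not instrument specifications).

v = 2.0        # prism speed in m/s (roughly a walking pace)
dt = 0.1       # assumed delay between measurement subsystems in s

position_error = v * dt    # metres between the two observation epochs
print(position_error)
```

With these assumed values the two subsystems observe points about 0.2 m apart, i.e. on the "few dozen centimetres" scale quoted for older instruments, which is why reducing subsystem delays directly reduces kinematic errors.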
The results and method of the measurements of D, H and T carried out at Hornsund in the summer of 1979 are presented. The relative and absolute values of these elements are given, reduced to the Polish magnetic station at Hornsund. An initial evaluation of changes in the magnetic field from 1957 to 1979 is carried out.
This paper presents the results of magnetic mapping carried out in the area of the metamorphic series of Ariekammen and Skoddefjellet. On the basis of a qualitative interpretation of the measurements, a number of anomalous zones were distinguished whose positions can be correlated with local changes in mineralization and polymetallic ore content in the Fuglebergsletta area. The anomalous zones run predominantly SE-NW, oblique to the nearly meridional strike of the layers of slates and marbles making up the metamorphic complex.
It was demonstrated that in fishes of the species Trematomus bernacchi, predominant in the region of the USSR Antarctic station Mirny (Davis Sea), body proportions change as the specimens grow. The measurements covered 20 plastic features in 171 fishes (total length 110.2—265.0 mm). Statistically significant changes in eleven body proportions were found during the growth of the fishes. Five other proportions changed with little statistical significance, whereas the remaining three body proportions did not change at all.
This paper presents the way in which temperature is measured in tests concerning structural transformations in various types of steel under welding conditions. In the test methodology, a small steel specimen was subjected to simulated welding thermal cycles, during which the temperature of the specimen, changes in magnetic permeability and thermal expansion were measured simultaneously. The measurement of these parameters required non-contact heating of the specimen, which involved the use of heating lamps. The temperature measurement was of key importance because the subsequent analysis of the remaining parameters was performed as a function of temperature.
The tests of structural transformations resulted in the development of Continuous Cooling Transformation (CCT) diagrams under welding conditions, enabling the assessment of steel weldability and providing the information needed to determine the effect of welding thermal cycles on the structure and properties of the tested material.
Related numerical models, used as the basis for the analysis of temperature distribution in the test specimen, have also been developed. These analyses examined the values and distribution of temperature in relation to various model parameters, i.e. thermocouple types, the geometrical features of the thermocouple junction and the diameter of the thermocouple wires. The results of the FEM calculations have been compared with the experiments.