JAQM Volume 1, Issue 2 - December 30, 2006
Interdisciplinarity - New Approaches and Perspectives in the Use of Quantitative Methods
Interdisciplinarity is widely regarded as the most productive approach to research. This paper presents the concept of interdisciplinarity and defines the necessary and sufficient conditions for developing efficient interdisciplinary research. It underlines the elements needed to build correct, concise, and consistent interfaces that ensure continuity and support the development of a common research language. An interdisciplinary model based on a collaborative structure is also developed.
Construction is an important pillar of the national economy, yet safety accidents in the building sector occur frequently every year. Building safety directly influences, and can even constrain, the development of the construction industry. The Support Vector Machine (SVM) is a machine learning method developed on the basis of statistical learning theory. Grounded in the principle of structural risk minimization, it can effectively avoid overfitting and offers good generalization capability and classification accuracy. In this paper, the author applies the method to the safety management of buildings and studies an early warning system for building safety based on SVM. Finally, experiments on the data confirm that the SVM method generalizes well.
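As a hedged illustration of the structural-risk idea behind SVM (a hinge loss plus an L2 penalty), the sketch below trains a toy linear classifier by sub-gradient descent. The safety indicators and labels are invented for illustration and are not the paper's data or its exact algorithm.

```python
def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Sub-gradient descent on hinge loss + L2 penalty (structural risk minimization)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # inside the margin: hinge-loss sub-gradient is active
                w = [wj - lr * (lam * wj - yi * xj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # correctly classified: only the regularizer contributes
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    """Sign of the decision function: +1 (safe) or -1 (at risk)."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Invented indicators, e.g. (inspection score, training level), +1 = safe, -1 = at risk
X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
```

A real early-warning system would of course use a kernelized SVM on many site-level features; this sketch only shows the optimization principle the abstract refers to.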
The Internet now holds a vast amount of electronic information on companies' financial performance, far exceeding our capacity to analyze it; we often lack tools to process these data quickly and accurately. Data mining (DM) techniques are useful mechanisms that can be applied to rapidly changing industries to obtain an overview of the situation. One such market is the international telecommunications industry. In this paper we construct a framework based on DM techniques that enables us to make class predictions about telecommunication companies' financial performance. Our methodology allows us to analyze the movements of the largest telecommunications companies and to see how companies perform financially relative to their competitors, what they are good at, and who the major competitors in this industry are.
Managing the security of enterprise information systems has become a critical issue in the era of the Internet economy. Like any other process, security cannot be managed if it cannot be measured. Metrics are needed to assess the current security status, to develop operational best practices, and to guide future security research. The topic is timely, as companies come under increasing compliance pressure to demonstrate due diligence in protecting their data assets. Metrics give companies a way to prioritize threats and vulnerabilities, and the risks they pose to enterprise information assets, based on a quantitative or qualitative measure. This paper presents a framework for ranking vulnerabilities in a consistent fashion, together with some operational metrics used by large enterprises in managing their information systems security process.
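To make the idea of quantitative vulnerability ranking concrete, here is a minimal sketch, not the paper's framework: each vulnerability gets a risk score (here simply likelihood times impact, both invented scales) and the inventory is sorted by descending score for prioritization. All names and numbers are hypothetical.

```python
def risk_score(vuln):
    """Illustrative quantitative metric: likelihood (0-1) times impact (1-10)."""
    return vuln["likelihood"] * vuln["impact"]

def rank_vulnerabilities(vulns):
    """Order vulnerabilities by descending risk score, highest priority first."""
    return sorted(vulns, key=risk_score, reverse=True)

# Hypothetical inventory for a small enterprise
inventory = [
    {"name": "unpatched-web-server", "likelihood": 0.8, "impact": 9},
    {"name": "weak-password-policy", "likelihood": 0.6, "impact": 5},
    {"name": "open-test-port", "likelihood": 0.9, "impact": 2},
]
ranked = rank_vulnerabilities(inventory)
```

Real frameworks weight many more factors (exploitability, asset value, exposure), but any consistent scoring function of this shape yields the reproducible prioritization the abstract calls for.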
In this work, an application of the modified minsadbed (minimizing the sum of absolute differences between deviations) approach in a fuzzy environment is given. This type of regression has been used for a statistical model with two real parameters and real-valued experimental observations (see Arthanary and Dodge). We extend minsadbed to minsadbesd (minimizing the sum of absolute differences between squared deviations), which is more suitable for our model on vague sets.
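Reading the minsadbesd objective literally from its expansion, a crude sketch for a two-parameter line fit might look as follows; the pairwise objective matches the name's definition, but the grid search is for illustration only and is not the authors' estimation procedure.

```python
from itertools import combinations

def minsadbesd_objective(a, b, xs, ys):
    """Sum of absolute differences between squared deviations d_i = y_i - (a + b*x_i)."""
    d = [y - (a + b * x) for x, y in zip(xs, ys)]
    return sum(abs(di ** 2 - dj ** 2) for di, dj in combinations(d, 2))

def fit_by_grid(xs, ys, candidates):
    """Crude search over candidate (a, b) pairs -- illustration, not a real optimizer."""
    return min(candidates, key=lambda ab: minsadbesd_objective(ab[0], ab[1], xs, ys))

# Points exactly on y = 1 + 2x: the objective vanishes at (a, b) = (1, 2)
xs, ys = [0, 1, 2, 3], [1, 3, 5, 7]
best = fit_by_grid(xs, ys, [(0.0, 0.0), (1.0, 2.0), (2.0, 1.0), (1.0, 1.0)])
```

Note that the objective penalizes unequal deviations rather than large ones, which is what distinguishes the minsadbed family from least squares or least absolute deviations.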
It is very important for the government to help the public correctly understand current conditions and, through the media, to guide them toward creating a favorable atmosphere for resolving an incident. Media are therefore indispensable in emergency management, yet they are widely viewed as biased. We investigate news release when an emergency incident occurs using a game model, under the assumptions that audiences prefer information consistent with their beliefs and that media often slant stories toward those beliefs. We show that competition cannot reduce bias, only price, and that reader heterogeneity matters more than competition for accuracy in the media.
This paper represents a part of the author's contribution to the project "The Rehabilitation of Large Housing Estates in Romania" developed under the auspices of the National Council for Higher Education Scientific Research. It addresses the relationship between housing policy and local development policy mainly from an institutional and legislative perspective, focusing on the actors involved in supporting housing and urban renewal actions in Romania. The role of local public administration is examined in particular, considering the authority of city councils with regard to the rehabilitation of apartment-block areas and, in a wider context, to urban regeneration. Case studies of two Romanian cities are presented to reveal not only current opportunities but also a series of drawbacks in this process.
Recent economic and technological developments have led to a growing international demand for highly skilled human resources. The increased competition for human capital has led numerous OECD countries to take special measures to attract and retain human capital in fields such as information technology, biotechnology, nanotechnology, and health care. These measures have stimulated the emigration of highly skilled professionals, especially from less developed to more developed economies. In this international context, over the last decade, Romanians and other Eastern Europeans with an academic background have shown a significant propensity toward emigration.
Complexity measures are mainly used to estimate vital information about the reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide continuous feedback during a software project to assist in controlling the development process. Several models exist for classifying a software product's quality. These models often include different measures, on the basis of which the degree to which the product satisfies each quality attribute is established. Each model can have a different set of attributes at the highest level of classification, and the attributes can be defined differently at each level. Today, more and more activities are based on computer programs and thus depend heavily on their quality. In principle, everyone agrees that quality is important, but few agree on what quality is. In this paper, we present the most important models and standards for measuring software quality.
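As a small, hedged example of the kind of source-code complexity measure such models draw on, the sketch below approximates McCabe-style cyclomatic complexity for Python code by counting branching constructs in the syntax tree; it is an illustration, not any particular standard's exact definition.

```python
import ast

def cyclomatic_complexity(source):
    """McCabe-style approximation: 1 + the number of branching constructs."""
    branch_nodes = (ast.If, ast.For, ast.While, ast.ExceptHandler)
    return 1 + sum(isinstance(n, branch_nodes) for n in ast.walk(ast.parse(source)))

sample = """
def positive_sum(xs):
    total = 0
    for x in xs:
        if x > 0:
            total += x
    return total
"""
print(cyclomatic_complexity(sample))  # -> 3 (the for and the if each add one path)
```

Quality models then aggregate measures like this one, per module, into scores for attributes such as maintainability.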
Book Review on
PhD Thesis Review on