Book Volume 2
Page: i-i (1)
Author: Ajit Kumar Verma
Page: ii-ii (1)
Author: Om Prakash Yadav
Page: iii-iii (1)
Author: Mangey Ram
Page: iv-iv (1)
Author: Mangey Ram
Page: v-vi (2)
Author: Mangey Ram
Page: 1-33 (33)
Author: Tadashi Dohi, Lin Zhu Jin, Junjun Zheng and Hiroyuki Okamura
For large-scale software systems with huge amounts of source code, it is often useful to approximate the software fault-counting processes observed in the testing phase as continuous-state stochastic processes. In this chapter we introduce stochastic non-counting process models to describe fault-detection phenomena in software testing. Time-nonhomogeneous Gaussian process and time-nonhomogeneous gamma process-based software reliability models (SRMs) are summarized and compared with existing SRMs such as the geometric Brownian motion and nonhomogeneous Poisson process (NHPP) models. Numerical examples with actual software development project data show that the time-nonhomogeneous gamma process-based SRMs can provide better goodness-of-fit and predictive performance than the existing SRMs in many cases.
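As a minimal sketch of the kind of NHPP baseline this chapter compares against, the classical Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) can be fitted to cumulative fault-count data; the function names, the grid-search fitting, and the synthetic data below are illustrative assumptions, not the chapter's own estimation procedure:

```python
import math

def go_mean(t, a, b):
    # Goel-Okumoto NHPP mean value function: expected cumulative number
    # of faults detected by testing time t, where a is the total expected
    # fault content and b is the per-fault detection rate.
    return a * (1.0 - math.exp(-b * t))

def fit_go(times, counts, a_grid, b_grid):
    # Simple least-squares grid search over candidate (a, b) pairs;
    # real studies would use maximum likelihood instead.
    best = None
    for a in a_grid:
        for b in b_grid:
            sse = sum((go_mean(t, a, b) - c) ** 2
                      for t, c in zip(times, counts))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

# Synthetic fault-count data generated from a known model (a=100, b=0.05)
times = list(range(1, 21))
counts = [go_mean(t, 100.0, 0.05) for t in times]
a_hat, b_hat = fit_go(times, counts,
                      a_grid=[80.0, 90.0, 100.0, 110.0],
                      b_grid=[0.01, 0.05, 0.1, 0.2])
```

Because the synthetic data come from parameters on the grid, the search recovers them exactly; with real project data one would compare such fits against the gamma process-based SRMs the chapter proposes.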
Development of Software Reliability Growth Models with Time Lag and Change-Point and a New Perspective for Release Time Problem
Page: 34-52 (19)
Author: Avinash K. Shrivastava and Parmod K. Kapur
Over the past three decades, many software reliability growth models (SRGMs) have been proposed and used to predict and estimate software reliability. A common assumption of these conventional SRGMs is that detected faults are removed immediately. In reality, this assumption may not hold. During debugging, developers need time to reproduce the failure, identify the root causes of the faults, fix them, and then re-run the software. In this chapter, we propose a fault dependency model that treats fault removal as a three-stage process of removing leading, dependent, and additional faults in the software. Leading faults can be removed upon failure, whereas dependent faults are masked by leading faults and can only be removed after the corresponding leading faults have been removed. Additional faults are those removed during the removal of dependent faults; they cannot be removed directly because they are never identified as causing a failure, but the testing team detects and removes them from the code while removing dependent faults. Also, during testing the fault correction rate may not be constant, owing to changes in testing strategies, experience, and skills as time proceeds; the point at which the detection/correction rate changes is known as the change-point. We incorporate the concept of the change-point to model a new lag-function-based SRGM. Furthermore, in the existing software reliability literature, cost-model frameworks calculate the release time and reliability of the software from either the detection process or the correction process alone, ignoring the impact of the other. Since both detection and correction have a significant impact on release timing and reliability prediction, neither can be neglected. In this chapter we provide a new perspective on the release time problem that incorporates both detection and correction costs.
The above models are validated on real-life software failure data sets from Tandem Computers, and a numerical example illustrates the significance of the new release policy.
Stochastic Differential Equation Based Formulation for Multiple Software Release Considering Fault Detection and Correction Process
Page: 53-71 (19)
Author: Adarsh Anand, Deepika, Ompal Singh and Parmod K. Kapur
Fault detection and correction are two distinct yet important activities for software developers, and they should go hand in hand for a good debugging process. Both concepts have been studied mathematically in depth, and much more work is in the pipeline. In the present framework, we develop software reliability growth models (SRGMs) that incorporate the concept of software released in multiple versions. A stochastic differential equation approach is used to model a scenario that jointly captures the fault detection and correction processes. The proposed models are validated on the Tandem data set, which comprises four successive software releases, and the results, authenticated by a weighted-criteria technique, are promising.
Page: 72-107 (36)
Author: Hemant K. Verma, Vinay Verma and Nitin Khanna
With rapid technological advancements over the last few decades, our information and communication systems are progressively becoming completely digitized. Simultaneously, as a consequence of modern networking technologies, vast amounts of digital information can be stored and transmitted within seconds. Multimedia-enabled handheld devices are empowering even the common person to communicate on the go. Today, our understanding of events is highly dependent on visual information, and digital images have consequently become a prominent carrier of information. Our lifestyles' dependence on digital images makes their security and reliability critically important. In parallel with this technological advancement, software tools capable of easily manipulating digital images in a user-friendly manner have become widely available, putting at risk the trust we place in a digital image as proof of an event. This chapter presents an introduction to the emerging field of digital image forensics, which aims to provide authenticity and security for digital images. The introduction is followed by a detailed analysis of one class of image forensic methods: those based on color filter array interpolation. An experimental comparison of some prominent works and recent developments in this field on a common dataset is also discussed.
Page: 108-122 (15)
Author: Yogesh K. Sharma, Sachin K. Mangla, Pravin P. Patil, Alok K. Yadav, Suresh K. Jakhar and Sunil Luthra
Food safety and security is a growing research area owing to the increasing demand for food across the globe. Customers nowadays are more conscious about their health, and past cases of food contamination and adulteration have raised concerns about the safety and security of food supply companies. To address these concerns, IT-based technologies can play an important role. In the present chapter, nine such IT-based technologies were identified through a review of the extant literature. The analytic hierarchy process (AHP) approach is then used to rank the identified technologies by priority, and a sensitivity analysis is performed to check the robustness of the model. Radio frequency identification technology holds the first rank on the priority list.
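The AHP step described above derives priority weights from a pairwise comparison matrix. A minimal sketch, assuming a hypothetical 3x3 comparison of three unnamed technologies (the chapter itself ranks nine), computes the principal eigenvector by power iteration and Saaty's consistency ratio:

```python
def ahp_priorities(A, iters=200):
    # Priority weights from a pairwise comparison matrix A via power
    # iteration on the principal eigenvector, plus the consistency
    # ratio CR = CI / RI using Saaty's random indices.
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # estimate of the principal eigenvalue lambda_max
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    ci = (lam - n) / (n - 1)
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical comparison: technology T1 moderately preferred to T2,
# strongly preferred to T3 (Saaty 1-9 scale, reciprocal matrix).
A = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     3.0],
     [1 / 5.0, 1 / 3.0, 1.0]]
weights, cr = ahp_priorities(A)
```

A CR below 0.1 is conventionally taken to mean the judgments are acceptably consistent; the weights then give the ranking (here T1 first, mirroring how RFID tops the chapter's list).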
Page: 123-167 (45)
Author: Ali Muhammad Rushdi and Muhammad Ali Rushdi
We describe the mathematical steps and the main features of the Modern Syllogistic Method (MSM), a relatively recent technique of deductive inference in propositional logic. This method ferrets out from a set of premises all that can be concluded from them, with the resulting conclusions cast in the simplest or most compact form. We demonstrate the applicability of the method to a variety of problems via eight examples that illustrate its mathematical details and exhibit the nature of the truth-preserving conclusions it can produce. The method is shown to be particularly useful for detecting inconsistency within a set of given premises or hypotheses, helping its user penetrate to the heart of a problem and confront fallacious or fallacy-based argumentation. It is also demonstrated to yield fruitful results that can aid in exploring, and perhaps resolving, complex problems such as ethical dilemmas. The method is further shown to have a prominent application in the analysis of relational databases, where it offers a variety of algorithms for deriving the closure of a set of functional dependencies and the set of all candidate keys. Finally, several potential extensions and new applications of the method are outlined.
Page: 168-196 (29)
Author: Rishi Prakash and Dharmendra Singh
In this chapter, we discuss the mathematical formulation of scattering in the bistatic domain, where the scattering phenomenon is associated with the soil surface. The discussion is framed in the context of radar remote sensing, which characterizes soil parameters with the help of mathematical modeling, empirical equations, and optimization techniques. The chapter provides an overview of scattering in the bistatic domain, the scatterometer setup, and the retrieval methodology for various soil parameters. The soil parameters most sensitive to radar scattering are soil texture, soil moisture, and surface roughness; information about them is very important in applications such as agriculture, weather forecasting, soil erosion studies, hydrological studies, and many more. Scattering from the soil surface depends on two major sets of parameters: sensor parameters (frequency, incidence angle, and polarization) and soil parameters. The roles of incidence angle and polarization in the scattering mechanism are dealt with in detail, and it is shown that they play a major role in the retrieval of soil parameters. The sensitivity of the scattering coefficient to different soil parameters is established with experimental observations. The Kirchhoff Scalar Approximation, a theoretical approach that provides mathematical expressions for the scattering coefficient based on well-established electromagnetic wave theory, is then presented, along with a soil parameter retrieval methodology that combines this theoretical approach with experimental observations.
Multi-Objective Non-Linear Programming Problem for Reliability Optimization in Intuitionistic Fuzzy Environment
Page: 197-229 (33)
Author: Harish Garg
In this chapter, multi-objective reliability-cost optimization problems are investigated using uncertain, vague, and imprecise information. In the formulation, the reliability of each component of the system is represented in the form of a triangular interval. The conflicting nature of the objectives is resolved with an intuitionistic fuzzy programming technique employing linear as well as non-linear membership functions. A crisp model is formulated by using a product aggregation operator to aggregate the expected values of the objectives. The resulting problem is solved with a gravitational search algorithm (GSA), and its results are compared with those of particle swarm optimization (PSO) and a genetic algorithm (GA). The results are validated through a statistical t-test.
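To make the membership-function idea concrete, the following sketch shows a linear acceptance/rejection pair for a minimization objective and a product-style aggregation into a single crisp score; the function names and the specific aggregation form are illustrative assumptions, not the chapter's exact crisp model:

```python
def membership(f, L, U):
    # Linear degree of acceptance for a minimization objective:
    # 1 at the aspired level L, falling linearly to 0 at the
    # tolerance limit U.
    if f <= L:
        return 1.0
    if f >= U:
        return 0.0
    return (U - f) / (U - L)

def nonmembership(f, L, U):
    # Linear degree of rejection: 0 at L, rising to 1 at U.
    # (A complementary pair; real intuitionistic models allow
    # membership + non-membership < 1.)
    return 1.0 - membership(f, L, U)

def product_score(values, bounds):
    # Product aggregation of acceptance and (1 - rejection) across
    # all objectives, yielding a crisp score in [0, 1] that a solver
    # such as GSA, PSO, or GA could maximize.
    score = 1.0
    for f, (L, U) in zip(values, bounds):
        score *= membership(f, L, U) * (1.0 - nonmembership(f, L, U))
    return score
```

With one objective halfway between its bounds, acceptance and (1 - rejection) are both 0.5, so the score is 0.25; a metaheuristic would search the design space for component reliabilities that push this score toward 1.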
Page: 230-264 (35)
Author: Kanchan Das
The importance of quality in overall supply chain management is paramount. In the current business world, assured quality of products and services is a precondition for their marketability; quality is thus the most crucial factor deciding the success or failure of a supply chain organization. Accordingly, quality should be planned for each supply chain function at the design stage. This research defines and formulates quality metrics for supply, product design and manufacturing, and customer process management, with the objective of obtaining superior business-system performance by integrating the quality system of each function into the overall supply chain design and planning process. The research follows systematic steps in defining and formulating metrics for each function and in integrating the functional quality criteria into the overall supply chain design and planning model. A numerical example illustrates the applicability of the entire approach and the models.
Identification of Zonal-Wise Passengers' Issues in Indian Railways Using Latent Dirichlet Allocation (LDA): A Sentiment Analysis Approach on Tweets
Page: 265-276 (12)
Author: Vijay Singh, Mangey Ram and Bhasker Pant
Twitter is an effective medium for detecting the sentiment of the masses, and with the increasing penetration of such social services in society, its relevance and credibility are also increasing. In this article, we analyze the 16 Indian Railways zonal regions and zonal-wise passengers' issues concerning travel using Latent Dirichlet Allocation (LDA). The results generated by LDA show that the major concerns of passengers are sanitation, women's security, rats, and bad behavior by other passengers. These results can be used to further improve the performance of Indian Railways and to support decision making.
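For readers unfamiliar with LDA, the following toy collapsed Gibbs sampler shows the mechanics on a few hypothetical tweet-like documents; it is a didactic sketch (the documents, topic count, and hyperparameters are invented), not the chapter's pipeline, which works on real tweets across 16 zonal regions:

```python
import random

def lda_gibbs(docs, k, iters=100, alpha=0.1, beta=0.01, seed=0):
    # Toy collapsed Gibbs sampler for LDA.  docs is a list of tokenized
    # documents; returns (vocab, phi, theta) where phi[j] is topic j's
    # word distribution and theta[d] is document d's topic mixture.
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    w2i = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    ndk = [[0] * k for _ in docs]          # doc-topic counts
    nkw = [[0] * V for _ in range(k)]      # topic-word counts
    nk = [0] * k                           # topic totals
    z = []                                 # current topic assignments
    for di, d in enumerate(docs):
        zs = []
        for w in d:
            t = rng.randrange(k)
            zs.append(t)
            ndk[di][t] += 1
            nkw[t][w2i[w]] += 1
            nk[t] += 1
        z.append(zs)
    for _ in range(iters):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                t, v = z[di][wi], w2i[w]
                ndk[di][t] -= 1; nkw[t][v] -= 1; nk[t] -= 1
                # full conditional p(z = j | all other assignments)
                p = [(ndk[di][j] + alpha) * (nkw[j][v] + beta)
                     / (nk[j] + V * beta) for j in range(k)]
                r = rng.random() * sum(p)
                for j, pj in enumerate(p):
                    r -= pj
                    if r <= 0:
                        t = j
                        break
                z[di][wi] = t
                ndk[di][t] += 1; nkw[t][v] += 1; nk[t] += 1
    phi = [[(nkw[j][v] + beta) / (nk[j] + V * beta) for v in range(V)]
           for j in range(k)]
    theta = [[(ndk[di][j] + alpha) / (len(docs[di]) + k * alpha)
              for j in range(k)] for di in range(len(docs))]
    return vocab, phi, theta

# Hypothetical tweet-like documents about two broad passenger concerns
docs = [["dirty", "coach", "sanitation", "dirty"],
        ["sanitation", "toilet", "dirty"],
        ["late", "train", "delay"],
        ["delay", "late", "refund"]]
vocab, phi, theta = lda_gibbs(docs, k=2)
```

Each topic's top-probability words then label a concern (here, roughly "sanitation" versus "delays"); production studies would instead use a tuned library implementation on the full tweet corpus.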
Page: 277-282 (6)
Author: Mangey Ram
Recent developments in information science and technology have been possible due to original and timely research contributions containing new results in various fields of applied mathematics. It is also true that advances in information science create opportunities for developing mathematical models further.