Transient Stability Analysis of Multi-Machine System Equipped with Hybrid Power Flow Controller
Lini Mathew, S. Chatterji
Abstract
Recently, a novel hybrid power flow controller (HPFC) topology for FACTS was proposed. The key benefit of the new topology is that it fully utilizes existing equipment. MATLAB/Simulink models of three different configurations of the HPFC have been investigated for their characteristics by incorporating them in a multi-machine (MM) system. In all three cases, the steady-state stable values, the time taken to attain stability, the maximum overshoot and the rise time of the relative angular positions delt1-2, delt2-3 and delt3-1 have been determined by varying the FCT and also by varying the damping constants. Any of these configurations can effectively be incorporated in an MM system and contributes significantly to the transient stability enhancement of the system.
Download Full Text
Using Clustering and Indexing to Enhance Customer Relationship Management Based on Customer and Product Value Estimation - A Neural Networks Approach
Jayaraj V., Jegathesh Amalraj J.
Abstract
This paper deals with providing an optimal solution that optimizes the process of Customer Relationship Management. Clustering is performed first; when a new customer arrives, the customer is classified into one of the existing clusters, which automatically yields the customer's properties. We perform a two-phase clustering that not only clusters on the basis of a single attribute but also applies a secondary clustering operation for better and more accurate results. Further, all the clustered information is indexed in such a way that clustering becomes faster.
Download Full Text
Implementation of an Efficient Matrix based Algorithm for Clustering in Web Usage Mining
Kanika Gupta, Kirti Aggarwal and Neha Aggarwal
Abstract
Usage patterns discovered through Web usage mining are effective in capturing item-to-item and user-to-user relationships and similarities at the level of user sessions. However, without the benefit of deeper domain knowledge, such patterns provide little insight into the underlying reasons for which such items or users are grouped together. Furthermore, the inherent and increasing heterogeneity of the Web has required Web-based applications to more effectively integrate a variety of types of data across multiple channels and from different sources. Thus, a focus on techniques and architectures for more effective integration and mining of content, usage, and structure data from different sources is likely to lead to the next generation of more useful and more intelligent applications, and more sophisticated tools for Web usage mining that can derive intelligence from user transactions on the Web. This paper gives an insight into the proposed MABAC Algorithm and the implementation to show how it provides useful information within clusters.
Download Full Text
Fractional Fourier Transform of Tempered Boehmians
S. B. Gaikwad
Abstract
Tempered Boehmians are introduced as a natural extension of tempered distributions. For this class of Boehmians it is possible to define an extension of the fractional Fourier transform. The fractional Fourier transform of a tempered Boehmian is a distribution. Distributions which are transforms of Boehmians are characterized, and an inversion theorem is proved.
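For reference, the fractional Fourier transform that the abstract extends is commonly defined, for an angle α not a multiple of π, by the kernel below. This is the standard textbook definition, not notation taken from the paper itself.

```latex
F_\alpha\{f\}(u) \;=\; \int_{-\infty}^{\infty} K_\alpha(u,t)\, f(t)\, dt,
\qquad
K_\alpha(u,t) \;=\; \sqrt{\frac{1 - i\cot\alpha}{2\pi}}\;
\exp\!\left( i\,\frac{u^2 + t^2}{2}\cot\alpha \;-\; i\,u t \csc\alpha \right).
```

For α = π/2 this reduces to the ordinary Fourier transform.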
Download Full Text
Investigation of Zero Chromatic Dispersion in Square Lattice As2Se3 Chalcogenide Glass PCF
Ravindra Kumar Sharma, Kirti Vyas and Navneet Jaroli
Abstract
We demonstrate a method to design As2Se3 glass photonic crystal fibers with flattened dispersion. Three new seven-layer, square-lattice As2Se3 glass photonic crystal fibers (PCFs) are designed and proposed. The numerical investigation shows that, in a square-lattice PCF, removing the first air-hole ring shifts the zero-dispersion and flattened-dispersion wavelengths toward the higher wavelength range. A scalar effective index method (SEIM) and transparent boundary condition (TBC) are used for comparing zero dispersion.
Download Full Text
Facial Expression Recognition System: A Practical Implementation
Kamal Kumar Ranga, Arpita Nagpal
Abstract
Facial expression is one of the most powerful and immediate means for human beings to communicate their emotions, intentions, and opinions to each other. Facial expressions also provide information about cognitive state, such as interest, boredom, confusion, and stress. Facial expressions are natural and can express emotions sooner than people verbalize their feelings. They convey non-verbal cues, which play an important role in interpersonal relations. Facial expression recognition technology helps in designing intelligent human-computer interfaces. In this paper a facial expression recognition technique has been applied to Indian faces extracted from a video. Initially, a live video of Indian college students is given as input to a Haar classifier, which traces out the faces in it. Then 42 facial feature points are detected using the Active Appearance Model (AAM) technique, from which we extract the facial features to be mapped on the extracted faces. In the last step four primary facial expressions (happy, sad, surprise, angry) are classified using a support vector machine (SVM). It was a very challenging task to integrate these artificial intelligence techniques and obtain a reasonable performance. The facial expression recognizer proposed here gave 83% accuracy.
Download Full Text
Replacement of Graphic Translations with Two-Dimensional Cellular Automata, Twenty Five Neighborhood Model
Fasel Qadir, Shah Jahan Peer M. A. and Khan K. A.
Abstract
Images are being processed and manipulated at every step with the help of modern multimedia tools. In most gaming devices and cartoon series, movement of images is confined or restricted to the right, left, up and down directions only. Cellular automata can be successfully applied in image processing. A cellular automaton uses a discrete space to represent the state of each element of a domain, and this state can be changed according to a transition rule. With our scheme, we are able not only to translate the image along the x- and y-axes, but also to achieve diagonal translations. Uniform cellular automata rules are constructed to transform the images in all directions.
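As a rough illustration of the idea in this abstract, a uniform CA rule in which every cell copies the value of a fixed neighbour within its 5×5 (25-cell) neighbourhood translates the whole image by that offset, including diagonally. This is a minimal sketch of the general principle, not the paper's exact rule set.

```python
def translate(image, dx, dy, fill=0):
    """Translate a 2D grid by (dx, dy) with a uniform CA rule: every
    cell copies the neighbour at offset (-dy, -dx).  The offset must
    lie within the 25-cell neighbourhood, i.e. |dx|, |dy| <= 2."""
    assert abs(dx) <= 2 and abs(dy) <= 2, "offset outside 25-neighbourhood"
    h, w = len(image), len(image[0])
    out = [[fill] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            si, sj = i - dy, j - dx      # source cell for this cell
            if 0 <= si < h and 0 <= sj < w:
                out[i][j] = image[si][sj]
    return out
```

Applying the same rule with dx = dy = 1 moves the image one cell diagonally, which is the case plain right/left/up/down translation cannot cover.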
Download Full Text
Enhanced Least Significant Bit Algorithm for Image Steganography
Shilpa Gupta, Geeta Gujral and Neha Aggarwal
Abstract
The rapid development of data transfer through the internet has made it easier to send data accurately and quickly to the destination; however, to transfer data securely to the destination without any modification, approaches such as steganography are needed. This paper introduces a new steganographic algorithm, the "Enhanced LSB Algorithm", which causes negligible distortion compared with the Least Significant Bit algorithm.
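For context, the baseline LSB technique that the paper improves on overwrites the least significant bit of each cover byte with one message bit. The sketch below shows that baseline only; the paper's "Enhanced LSB" variant is not specified in the abstract, so no attempt is made to reproduce it.

```python
def embed_lsb(pixels, message_bits):
    """Classic 1-bit LSB embedding: overwrite the least significant
    bit of each cover byte with one message bit."""
    assert len(message_bits) <= len(pixels), "message too long for cover"
    stego = list(pixels)
    for k, bit in enumerate(message_bits):
        stego[k] = (stego[k] & ~1) | bit     # clear LSB, then set it
    return stego

def extract_lsb(pixels, n_bits):
    """Recover n_bits message bits from the cover's LSBs."""
    return [p & 1 for p in pixels[:n_bits]]
```

Each byte changes by at most 1, which is why plain LSB embedding is already visually subtle; an "enhanced" variant would aim to reduce even that distortion.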
Download Full Text
Evaluating the performance of Symmetric Key Algorithms: AES (Advanced Encryption Standard) and DES (Data Encryption Standard)
Shanta, Jyoti Vashishtha
Abstract
Encryption algorithms are known to be computationally intensive. Internet and network applications are growing very fast, and so is the need to protect them. Encryption plays a very important role in information security systems. On the other hand, symmetric key algorithms consume a significant amount of computing resources such as speed, time, memory and cost. Current AES and DES standards are applied to encrypt and protect data. This paper provides an evaluation of the most common encryption algorithms, namely AES (Rijndael, the Advanced Encryption Standard) and DES (the Data Encryption Standard). A comparison of these encryption algorithms has been conducted under different settings for speed, time and cost. Our results show that AES is more suitable than DES.
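A speed comparison of this kind boils down to timing each cipher over the same payload. The harness below sketches that measurement; since the Python standard library ships no AES or DES, a toy XOR cipher stands in for the real algorithms, and is labelled as such — it is purely a placeholder for whatever cipher implementations the study actually benchmarked.

```python
import time

def benchmark(encrypt, data, runs=100):
    """Average wall-clock seconds per call of `encrypt` on `data`."""
    t0 = time.perf_counter()
    for _ in range(runs):
        encrypt(data)
    return (time.perf_counter() - t0) / runs

# Toy stand-in cipher (illustrative only -- NOT real AES or DES).
def xor_cipher(data, key=0x5A):
    return bytes(b ^ key for b in data)

payload = bytes(range(256)) * 64          # 16 KiB test buffer
per_call = benchmark(xor_cipher, payload)
```

In a real evaluation, `xor_cipher` would be replaced by AES and DES routines (e.g. from a crypto library) and the per-call times compared across payload sizes.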
Download Full Text
Offline Signature Verification Using Neural Network
Upasana Dewan, Javed Ashraf
Abstract
Even today an increasing number of transactions, especially financial ones, are being authorized via signatures; hence methods of automatic signature verification must be developed if authenticity is to be verified on a regular basis. Verification can be performed either offline or online based on the application. Online systems use dynamic information of a signature, like velocity, acceleration and pressure, captured at the time the signature is made. Offline systems work on the scanned image of a signature. In this paper we present a method for offline verification of signatures using a set of simple geometric features. The features used are token length, average values, trend coefficients and standard deviations of observation components. Before extracting the features, pre-processing of the scanned image is necessary to isolate the signature part and to remove any spurious noise present. The system is based on a backpropagation neural network and is initially trained using a database of signatures obtained from the individual whose signatures have to be authenticated by the system. Then another set of test signatures of the same person is input to the system to check whether they are genuine or forgeries. We accept or reject the test signatures by using a suitable threshold: if the magnitude of the output of the neural network is less than a pre-defined threshold (corresponding to the minimum acceptable degree of similarity), the test signature is verified as genuine; otherwise it is detected as a forgery.
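Two pieces of the pipeline described above can be sketched directly: the global statistics used as features, and the thresholded accept/reject decision on the network's output. The exact feature definitions in the paper may differ; this is a simplified reading.

```python
import statistics

def features(samples):
    """Global features of a 1-D observation sequence, in the spirit
    of the token length / averages / standard deviations the paper
    lists (exact definitions in the paper may differ)."""
    return (len(samples), statistics.fmean(samples), statistics.stdev(samples))

def verify(nn_output, threshold=0.3):
    """Accept the test signature as genuine when the magnitude of the
    network output is below the pre-defined similarity threshold, as
    the abstract describes.  The threshold value here is arbitrary."""
    return abs(nn_output) < threshold
```

The threshold trades off false acceptance (forgeries passing) against false rejection (genuine signatures failing), and would be tuned per signer in practice.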
Download Full Text
Software Reliability
Sushma Malik
Abstract
Unreliability of any product comes from failures or the presence of faults in the system. As software does not "wear out" or "age" as a mechanical or an electronic system does, the unreliability of software is primarily due to bugs or design faults in the software. Reliability is a probabilistic measure that assumes the occurrence of failure of software is a random phenomenon; randomness means that the failure cannot be predicted accurately. Software reliability is the probability of failure-free operation of a computer program for a specified period of time in a specified environment. Software reliability is dynamic in nature. It differs from hardware reliability in that it reflects design perfection rather than manufacturing perfection.
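The definition above has a simple quantitative form under the common constant-failure-rate assumption, where the probability of failure-free operation over time t is exponential. This is one standard model, not one the abstract itself names.

```python
import math

def reliability(failure_rate, t):
    """Probability of failure-free operation for duration t under a
    constant failure rate (the exponential reliability model):
    R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * t)
```

For example, a program with failure rate 0.1 failures/hour has roughly a 37% chance of running 10 hours without failure.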
Download Full Text
Permuted Diagonal Maximum Weight Matching (PDMWM) Scheme for Cell Scheduling in Fixed Length Packet Switches
Sunil Kore, Ajinkya Biradar, Sayali Kore
Abstract
Explosive growth of the internet is demanding very fast switching fabrics in internet routers and switches. Packets need to be buffered at the input, the output, or both sides of the crossbar switching fabric. Crossbar switches are used for switching because of their lack of bandwidth limitation and high scalability. It is a well-known fact that buffering packets on the output side demands a switching fabric that is 'N' times faster, whereas buffering packets on the input side limits throughput to 58%. A combined input-output queued (CIOQ) switch demands a switch fabric running at a speedup of 2. Hence VOQ (Virtual Output Queueing), i.e. 'N' queues per input port and thus N^2 queues in total on the input side, has been suggested to resolve the 58% throughput limitation of input-queued switches; with VOQ, 100% throughput is achieved. Selection of packets is the key issue in VOQ, and various schemes such as MWM, MSM and maximal matching have been suggested by researchers over the last two decades to improve performance in terms of throughput and delay. We address a Permuted Diagonal Maximum Weight Matching (PDMWM) scheme which provides 100% instantaneous throughput in each slot under heavy traffic conditions and improves delay performance. Our PDMWM scheme is computationally complex for large switches, but it outperforms at smaller switch sizes and provides optimal performance, close to that of an output-queued switch.
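One simplified reading of "permuted diagonal" matching is sketched below: each of the N cyclic diagonals of the N×N VOQ weight matrix is itself a conflict-free matching (every input paired with a distinct output), so the scheduler can simply pick the heaviest diagonal each slot. The paper's exact PDMWM scheme may differ; this only illustrates why diagonal matchings are attractive.

```python
def permuted_diagonal_match(weights):
    """Pick the heaviest of the N cyclic 'permuted diagonal'
    matchings of an NxN VOQ weight matrix.  Diagonal k matches
    input i to output (i + k) mod N, so each diagonal is
    conflict-free by construction."""
    n = len(weights)
    best_k = max(range(n),
                 key=lambda k: sum(weights[i][(i + k) % n] for i in range(n)))
    return [(i, (i + best_k) % n) for i in range(n)]
```

Evaluating N diagonals costs O(N^2), far cheaper than exact maximum weight matching, at the price of considering only N of the N! possible matchings.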
Download Full Text
A Novel Approach for Watermarking using Dual Watermarking Technique
Rakhee Lakhera, Shital Behare, Alka Gulati
Abstract
A wavelet-domain and frequency-domain feature-based adaptive watermarking algorithm is proposed. Many existing steganography methods hide more secret data in edged areas than in smooth areas of the host image, which does not differentiate textures from edges and causes serious degradation in actual edge areas. To provide more security to the information, to enhance the hiding capacity of the image while avoiding abrupt changes in image edge areas, and to achieve better quality of the stego-image, a novel image data hiding technique combining wavelet watermarking and the frequency domain is proposed in this paper. The paper focuses on image-adaptive watermarking methods in the discrete wavelet transform domain and the frequency domain, since they yield better results regarding robustness and transparency than other watermarking schemes. The proposed dual adaptive watermarking algorithm in these two domains provides more security to the hidden information.
Download Full Text
Comparative analysis of mid-point based and proposed mean based K-Means Clustering Algorithm for Data Mining
Kirti Aggarwal, Neha Aggarwal
Abstract
In the original k-means algorithm the initial centroids are taken randomly from the input data set. This random selection of initial centroids can lead the algorithm into local optima, and the final clustering results come out different each time. This is the limitation which needs to be addressed in order to make the k-means algorithm more efficient. The mid-point has been used as a metric for computing the initial centroids; that algorithm may be suitable for a wide variety of problems, but it is not suitable for all kinds. As it concentrates on calculating the mid-point of different subsets of the data set, it is most suitable for problems where the input data is regularly or uniformly distributed across the space; in situations where the input data is irregular or non-uniformly distributed, it will not produce appropriate results. This paper presents the mean as the metric for choosing initial centroids and a comparison of both algorithms.
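One plausible reading of a deterministic, mean-based initialisation is sketched below for 1-D data: rank points relative to the overall mean, split the ranking into k equal parts, and seed each cluster with its part's mean. The paper's exact rule may differ; the point is only that such seeds are reproducible, unlike random selection.

```python
def mean_based_centroids(points, k):
    """Deterministic initial centroids for k-means: sort the points,
    split them into k equal parts, and use each part's mean as a
    seed.  (A simplified sketch of a 'mean-based' initialisation;
    not the paper's exact algorithm.)"""
    pts = sorted(points)
    n = len(pts)
    seeds = []
    for c in range(k):
        part = pts[c * n // k:(c + 1) * n // k]
        seeds.append(sum(part) / len(part))
    return seeds
```

Because the seeds depend only on the data, repeated runs converge to the same clustering, removing the run-to-run variability the abstract criticises.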
Download Full Text
Classifying Five Different Arrhythmias by Analyzing the ECG Signals
Anup M. Vanage, R. H. Khade and D. B. Shinde
Abstract
An electrocardiogram (ECG) is a bioelectrical signal which records the heart's electrical activity versus time. It is an important diagnostic tool for assessing heart function, and the early detection of arrhythmia is very important for cardiac patients. The signals that make the heart's muscle fibers contract come from the sinoatrial node, the natural pacemaker of the heart. In an ECG test, the electrical impulses made while the heart is beating are recorded, usually on paper, along with any problems in the heart's rhythm; this shows the conduction of the heartbeat through the heart, which may be affected by heart disease. This paper aims at detecting and classifying different types of arrhythmias by analyzing ECG signals and extracting features from them. Five different classes are classified: supraventricular arrhythmias (svdb), malignant ventricular ectopy (vfdb), congestive heart failure (afdb), post-ictal heart rate oscillations in partial epilepsy (szdb) and normal sinus rhythm (nsrdb). Three different algorithms, FFT, AR and PCA, are used for feature extraction, and the classifier used is an ANN. The proposed techniques deal with whole 3-second intervals of the training and testing data. The processed signals come from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database.
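The FFT feature-extraction stage mentioned above amounts to computing a magnitude spectrum for each 3-second window and feeding it to the classifier. The sketch below uses a naive DFT (standard library only) rather than a real FFT; the window lengths, sampling rates and feature selection in the paper are not reproduced here.

```python
import cmath

def dft_magnitudes(signal):
    """Magnitude spectrum via a naive O(n^2) DFT -- the kind of
    frequency-domain feature vector an FFT stage would produce for
    each fixed-length ECG window."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]
```

In practice one would use `numpy.fft.rfft` on each windowed segment and keep only the leading magnitude bins as features.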
Download Full Text
A Novel Technique for Data Hiding in Audio by Using DWTS
Preeti Jain, Vijay Kumar Trivedi
Abstract
Secure data transfer is limited by attacks made on data communication over the internet. Audio data hiding can be used whenever data must be hidden. There are many reasons to hide data, but the most important is to prevent unauthorized persons from becoming aware of the existence of a message. A hidden message is information that is not immediately noticeable and must be discovered or uncovered and interpreted before it can be known. In this paper, we propose a novel high-capacity audio steganography algorithm based on the wavelet packet transform with adaptive hiding in the least significant bits. The adaptive hiding is determined by the strength of the cover samples and by bit-block matching between the message and cover signals. We propose two main stages, an input stage and a secret-message hiding stage: after the input audio is segmented, the secret message is hidden using the described preprocessing. The scheme is developed in MATLAB.
Download Full Text
FPGA Implementation of π/4 -QPSK Modulator and Demodulator
A. M. Bhavikatti, Subhash Kulkarni, Uday Kalyani
Abstract
Here we present the FPGA-based simulation of a π/4-QPSK modulator and demodulator. π/4-QPSK is a widely used modulation scheme in satellite radio applications. The complete modulator and demodulator units are modeled in VHDL and their functionality is verified using ModelSim simulation tools. The code was synthesized on the Spartan-3E FPGA. Simulation waveforms and a brief synthesis report are also presented in this paper.
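The core of a π/4-QPSK modulator is a differential phase map: each dibit advances the carrier phase by ±π/4 or ±3π/4, so consecutive symbols never pass through the origin. The sketch below models that mapping at complex baseband; the dibit-to-phase assignment is a common Gray-coded choice and varies between standards, so treat it as illustrative rather than the paper's VHDL design.

```python
import cmath
import math

# Common Gray-coded dibit -> phase-increment table for pi/4-QPSK
# (the exact assignment differs between standards; illustrative only).
PHASE_STEP = {(0, 0): math.pi / 4, (0, 1): 3 * math.pi / 4,
              (1, 1): -3 * math.pi / 4, (1, 0): -math.pi / 4}

def modulate(dibits, phase0=0.0):
    """Differentially encode a dibit stream into complex baseband
    symbols: each dibit rotates the previous phase by its entry in
    PHASE_STEP."""
    phase, symbols = phase0, []
    for d in dibits:
        phase += PHASE_STEP[d]
        symbols.append(cmath.exp(1j * phase))
    return symbols
```

Demodulation reverses this: the receiver measures the phase difference between consecutive symbols and looks up the dibit, which is why π/4-QPSK needs no absolute carrier phase reference.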
Download Full Text
DTC SVPWM: Advanced Techniques for Reduced Common Mode Voltage
Raveendra Reddy V., Veera Reddy V. C.
Abstract
This paper presents a new direct torque control (DTC) algorithm, based on space vector pulse width modulation (SVPWM) with imaginary switching states, for induction machine drives, capable of reducing the common-mode voltage (Vcm) conducted emissions of the drive. It is based on the application of only odd or only even voltage vectors in each sector in which the stator flux lies. In conventional SVPWM the reference vector is generated by time-averaging the two nearby active voltage vectors and two zero voltage vectors in every sample time (Ts). In the new SVPWM the common-mode emissions are reduced and the complexity involved in calculating Vref is decreased.
Download Full Text
Assessment of Individual Income Tax, Tax Planning and Saving in India
Rajiv Kaushik
Abstract
Individual income tax is a subject matter of the central government. If individuals want to assess their income tax, they should have knowledge of the individual income tax structure. After calculating their total income for a particular financial year, individuals can assess their income tax after deducting savings and making other adjustments. By doing so they can plan their savings and income tax in advance.
Download Full Text
Spam Filtering and Removing Spam Content from Messages by Using Naive Bayes
Abha Suryavanshi, Shishir Shandilya
Abstract
As the spam problem has grown larger, interest in spam filters has grown accordingly. As with any new technology, new terms were introduced that are used frequently in this research work and need to be explained: we refer to email that is not spam as ham. Spam costs the spammer very little to send, since most of the cost is absorbed by the recipients or by the carriers. Furthermore, sorting out the unwanted messages takes time and introduces a risk of deleting normal email by mistake. It is therefore necessary to have a good spam filtering system which is able to classify incoming emails as legitimate or spam. The reduction of dimensionality by selecting new attributes that are a subset of the old attributes is known as feature selection. Blacklists stop known spammers and eliminate spam sent by spam-bots, decoy email boxes give warning of new spam attacks, and Bayesian filters detect spam and virus-attack emails right from the start, even before the blacklist is updated, along with the bounce emails created by viruses. In this paper we not only filter spam but also categorize it, remove spam words or content from spam mail, and regenerate the message, which is then saved as a non-spam message.
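The Bayesian filtering step the abstract relies on can be sketched as a word-count naive Bayes classifier with Laplace smoothing. This is the textbook formulation, not the paper's exact feature-selection or spam-removal pipeline.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (words, label) pairs with label 'spam' or 'ham'.
    Returns per-class word counts and per-class document totals."""
    counts = {'spam': Counter(), 'ham': Counter()}
    totals = Counter()
    for words, label in docs:
        counts[label].update(words)
        totals[label] += 1
    return counts, totals

def classify(words, counts, totals):
    """Pick the class with the highest log-posterior, using Laplace
    (add-one) smoothing over the joint vocabulary."""
    vocab = set(counts['spam']) | set(counts['ham'])
    best, best_lp = None, float('-inf')
    for label in ('spam', 'ham'):
        lp = math.log(totals[label] / sum(totals.values()))   # prior
        denom = sum(counts[label].values()) + len(vocab)
        for w in words:
            lp += math.log((counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

The paper goes further than classification, stripping the spam-indicative words from a message and re-saving it as ham; that step is not modelled here.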
Download Full Text
An Efficient Image Compression Technique using 2D-DWT and FELICS Algorithm for Different Class of Images (COI)
Sunil Kore, Akshay Bhosale
Abstract
Image compression has become an important process in today's world of information exchange. It helps in effective utilization of high-speed network resources and plays a vital role in surveillance applications. Efforts are made to enhance the image compression ratio by combining lossy and lossless compression techniques. In this system, the image is first compressed using the lossy 2D-DWT technique and then further compressed using a VLSI-oriented lossless FELICS algorithm. The 2D-DWT uses the Haar wavelet for lossy image compression. The FELICS algorithm uses simplified adjusted binary code and Golomb-Rice code with a storage-less k parameter for lossless image compression. Three different techniques, viz. FELICS, JPEG and the proposed combination of 2D-DWT and FELICS, are used to compress a class of images (COI). Other authors have worked with image compression techniques and measured one or two image quality parameters; here we work with seven different image quality parameters across different classes of images.
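The Golomb-Rice code mentioned above encodes a non-negative value with parameter k as a unary quotient followed by k binary remainder bits; it is the entropy coder FELICS relies on for prediction residuals. A minimal sketch (bit strings for clarity, not the paper's VLSI formulation):

```python
def rice_encode(value, k):
    """Golomb-Rice code: unary quotient (q ones and a terminating
    zero), then the k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    bits = '1' * q + '0'                 # unary part + terminator
    if k:
        bits += format(r, '0%db' % k)    # fixed-width remainder
    return bits

def rice_decode(bits, k):
    """Invert rice_encode for a single codeword."""
    q = bits.index('0')                  # count leading ones
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r
```

Small residuals yield short codewords, so choosing k to match the local residual statistics (as FELICS does per context) keeps the output compact.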
Download Full Text
Diagnosis of Diabetes in Female Population of Pima Indian Heritage with Ensemble of BP Neural Network and SVM
Rahmat Zolfaghari
Abstract
Diabetes mellitus is one of the most serious health challenges facing Native Americans in the United States today. The publicly available Pima Indian Diabetes Database (PIDD) at the UCI Machine Learning Repository has become a standard for testing the accuracy of data mining algorithms in predicting diabetic status from the 8 variables given. In this study we predict the presence of diabetes using an ensemble of an SVM and a BP neural network. The predictive accuracy was 88.04%, the best accuracy obtained, which is very promising compared with the other classification systems in the literature for this problem.
Download Full Text
Artificial Neural Networks & Mathematical Models – A Comparison Study for Stock Market Volatility
J. K. Mantri
Abstract
The present study emphasizes a comparison among mathematical models (GARCH, Parkinson, Rogers-Satchell) and artificial neural network models for calculating the volatilities of the NSE and BSE. The results show that there is no significant difference between the volatilities of the Nifty and the Sensex estimated under the mathematical models and the ANN models. Hence the ANN model can be preferred over the others for calculating volatilities, due to its robustness and fault-tolerance characteristics.
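Of the estimators compared above, Parkinson's is the simplest to state: it estimates volatility from the daily high/low range rather than close-to-close returns. A minimal sketch of the standard formula (the study's exact data handling is not reproduced):

```python
import math

def parkinson_volatility(highs, lows):
    """Parkinson range-based volatility estimator:
    sigma = sqrt( sum(ln(H_i / L_i)^2) / (4 ln 2 * n) )
    over n trading periods with highs H_i and lows L_i."""
    n = len(highs)
    s = sum(math.log(h / l) ** 2 for h, l in zip(highs, lows))
    return math.sqrt(s / (4 * math.log(2) * n))
```

Range-based estimators like this use intraday information a close-to-close estimator discards, which is why they appear alongside GARCH in volatility comparisons.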
Download Full Text
Nurturing Cult Brands
Rajiv Kaushik
Abstract
Cult brands are brands which have a passionately loyal following and strong brand loyalty; their customers are organized around a cause; they are parodied in popular culture, have strong detractors, have stood the test of time, have events organized to celebrate the brand, and were the first of their kind in their industry. Cult brands differ from iconic brands in terms of culture, society, value and uniqueness.
Download Full Text