NCSC2D-2016: 3rd National Conference on Sustainable Computing and Communication for Development of Nation

"NCSC2D 2016 Conference Papers "

A Survey of Fault Management System in Wireless Sensor Network


In most cases, wireless sensor nodes are deployed at remote locations, where fault detection and recovery are crucial tasks. A fault management framework is the set of functions that detect, isolate, identify, and correct malfunctions. A fault management system includes algorithms for testing, diagnosing, and repairing network failures. The fault detection module holds a collection of all possible symptoms of probable faults. The fault isolation module observes alarm indications, considers possible causes, and designs hypotheses. Based on the available alarms, the fault identification module tests the hypotheses suggested by the fault detection module and identifies the fault. After identification, the detected fault should not affect network performance. The fault recovery module reduces the effect of the fault and repairs and reconfigures the faulty components. The aim of this paper is to study state-of-the-art research solutions for detecting faulty nodes in wireless sensor networks. It further provides observations, directions, and scope for future improvements. This paper may be a good starting point for those who want to pursue research in the fault management area of wireless sensor networks.
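
As an illustration of how a fault detection module might encode fault symptoms, here is a minimal sketch in Python; the heartbeat timeout, reading range, and node data are assumptions for illustration, not taken from the surveyed systems.

```python
# Hypothetical sketch of a fault detection module at a WSN sink.
# Assumes each node periodically reports a heartbeat timestamp and a reading.
import time

HEARTBEAT_TIMEOUT = 30.0       # seconds without a report before a node is suspect
READING_RANGE = (-40.0, 85.0)  # plausible sensor range, e.g. temperature in C

def detect_faults(last_seen, last_reading, now=None):
    """Return a dict of node_id -> list of fault symptoms."""
    now = now or time.time()
    symptoms = {}
    for node_id, ts in last_seen.items():
        s = []
        if now - ts > HEARTBEAT_TIMEOUT:
            s.append("silent")                  # possible crash / dead battery
        r = last_reading.get(node_id)
        if r is not None and not (READING_RANGE[0] <= r <= READING_RANGE[1]):
            s.append("out-of-range reading")    # possible sensor fault
        if s:
            symptoms[node_id] = s
    return symptoms

# Example: node 7 stopped reporting, node 3 returns an impossible value.
faults = detect_faults({3: time.time(), 7: time.time() - 120}, {3: 900.0})
print(faults)  # {3: ['out-of-range reading'], 7: ['silent']}
```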

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Security in Cloud Computing using Cryptographic Algorithms


Cloud Computing is a set of IT services, for example network, software system, storage, hardware, software, and resources, and these services are provided to a customer over a network. The IT services of Cloud Computing are delivered by a cloud service provider who owns the infrastructure. The benefits of cloud storage are easy access, meaning access to your data anytime, along with scalability, resilience, cost efficiency, and high reliability of the database. Because of these advantages and facilities, every organization or company is moving its data to the cloud, i.e., using the storage service provided by the cloud provider. So there is a need to protect that data against unauthorized access, modification, or denial of service. To secure the cloud means to secure the databases hosted by the cloud provider. In this research paper, the proposed work plan is to eliminate the concerns regarding data privacy using multilevel cryptographic algorithms to enhance security in the cloud as per the different perspectives of cloud customers.
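
The abstract does not fix specific algorithms; the following minimal sketch layers two independent symmetric ciphers (Fernet, an AES-based construction from the Python cryptography package) purely to illustrate the multilevel idea.

```python
# Minimal sketch of multilevel encryption: data is wrapped in two
# independent symmetric layers, so compromising one key is not enough.
from cryptography.fernet import Fernet

inner_key, outer_key = Fernet.generate_key(), Fernet.generate_key()
inner, outer = Fernet(inner_key), Fernet(outer_key)

def encrypt_multilevel(plaintext: bytes) -> bytes:
    return outer.encrypt(inner.encrypt(plaintext))

def decrypt_multilevel(ciphertext: bytes) -> bytes:
    return inner.decrypt(outer.decrypt(ciphertext))

token = encrypt_multilevel(b"customer record #42")
assert decrypt_multilevel(token) == b"customer record #42"
```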

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Friend Recommendation System: A Review


Nowadays the use of social networking sites is increasing exponentially. Almost 90 to 95 percent of the young generation (especially students) are using social networking services. Every social networking site uses different factors for recommending friends. These factors may include the social graph, tastes, moral standards, habits, attitudes, profession, etc. Two decades ago, people typically made friends with others who lived or worked around them, like neighbors or colleagues. With the rapid advances in social networks, services such as Facebook, Twitter, and Google+ have provided us new ways of making friends. So a big challenge for social networking services is how to recommend a good friend to a user.
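
As a minimal sketch of recommendation based only on the social graph, the following ranks non-friends by Jaccard similarity of friend sets (mutual friends); the toy graph is illustrative, and real systems would also weigh tastes, habits, profession, and the other factors the abstract lists.

```python
# Rank candidate friends of `user` by Jaccard similarity of friend sets.
def recommend_friends(graph: dict, user: str, top_n: int = 3):
    friends = graph[user]
    scores = {}
    for other, other_friends in graph.items():
        if other == user or other in friends:
            continue
        mutual = friends & other_friends
        if mutual:
            scores[other] = len(mutual) / len(friends | other_friends)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

graph = {
    "alice": {"bob", "carol"},
    "bob":   {"alice", "carol", "dave"},
    "carol": {"alice", "bob", "dave"},
    "dave":  {"bob", "carol"},
}
print(recommend_friends(graph, "alice"))  # ['dave']
```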

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Artery/Vein Classification of Retinal Images to Identify Diabetes Using Graph Based Approach


The prevalence of diabetes is expected to increase; already today it accounts for a large number of cases in many countries. In this paper, we detect diabetes by automating the analysis of vascular changes in retinal images. The extraction of retinal vessels is an important phase in automating the detection of vascular changes. Retinal vessels are of two types, viz. arteries and veins, and any changes in them may lead to several diseases. This paper proposes a graph-based approach for artery-vein classification which classifies the entire vascular tree by deciding on the type of each intersection point and assigning one of two labels to each vessel segment. Finally, by comparing a diagnostic indicator like the arteriolar-to-venular ratio, a disease like diabetes can be detected.
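
Once the graph-based method has labeled each vessel segment, the final diagnostic step reduces to computing the arteriolar-to-venular ratio (AVR); a minimal sketch follows, with made-up segment widths.

```python
# Sketch of the AVR computation from labeled vessel segments.
# Widths are in pixels; the segments below are illustrative examples.
from statistics import mean

def avr(segments):
    """segments: list of (label, width) with label 'artery' or 'vein'."""
    arteries = [w for label, w in segments if label == "artery"]
    veins = [w for label, w in segments if label == "vein"]
    return mean(arteries) / mean(veins)

segments = [("artery", 6.2), ("artery", 5.8), ("vein", 9.1), ("vein", 8.7)]
print(round(avr(segments), 3))  # ~0.674; a reduced AVR may indicate pathology
```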

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
REVIEW ON SELF ADAPTIVE SEMANTIC INFORMATION BASED CRAWLERS FOR DATA MINING SERVICES


In today's world, online advertisements are very popular with a number of industries, including the mining service industry. Web crawlers are among the most critical components used by search engines to collect pages from the web; crawling is an intelligent browsing technique used by the search engines. Service users may face three major problems, namely heterogeneity, ubiquity, and ambiguity, when searching for information over the internet. The framework of a self-adaptive semantic focused crawler, i.e., the SASF crawler, has the purpose of precisely and efficiently discovering, formatting, and indexing mining service information over the internet while taking these three major problems into account. This framework incorporates the technologies of semantic focused crawling and ontology learning to maintain the performance of the crawler. In this paper, a number of works from the literature are surveyed along with their drawbacks.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Accent Correction in Text to Speech Using Segmentation Methodology


A Text-To-Speech (TTS) system is a computer-based system that automatically converts text into artificial human speech. To build a natural-sounding speech synthesis system, it is essential that the text processing component produce an appropriate sequence of phonemic units corresponding to an arbitrary input text. Text processing and speech generation are the two main components of a text-to-speech system. The process of transforming text into speech consists of roughly two phases: first the text goes through analysis, and then the resulting information is used to generate the speech signal. The proposed PSOLA (Pitch Synchronous Overlap-Add) technique can flexibly change the rhythm of speech without changing the details of the original speech segments, to obtain higher clarity and naturalness. MFCC is based on the human peripheral auditory system; the performance of the Mel-Frequency Cepstral Coefficients (MFCC) may be affected by the number of filters and the type of window.
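
A minimal sketch of the MFCC front end using the librosa library; the file name is a placeholder, and n_mels (number of filters) and window are exactly the parameters the abstract flags as affecting MFCC performance.

```python
# Compute MFCCs for one utterance; parameter values are illustrative.
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)  # placeholder file
mfcc = librosa.feature.mfcc(
    y=y, sr=sr,
    n_mfcc=13,         # number of cepstral coefficients kept
    n_mels=26,         # number of mel filters
    window="hamming",  # analysis window type
)
print(mfcc.shape)  # (13, number_of_frames)
```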

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Image Processing Using Distributed Environment Approach


Image recognition is an important processing step in many image, video, and computer vision applications. Much research has been done on creating algorithms for image recognition, but it still faces problems of complexity, performance, efficiency, reliability, and accuracy. In the world of computer security, biometrics is an important authentication technique that authenticates an authorized person based on measurable physiological and individual characteristics. Face detection is an important step in all facial analysis algorithms, including face alignment, face recognition, face verification, and face authentication, and significant progress has been made in this field in the past. The work by Viola and Jones has made face detection practically feasible in real-world applications such as digital cameras and photo organization software. This research is based on the fusion of face images and fingerprint images. For detection of the face, the Viola-Jones algorithm is used; this algorithm detects an individual's face, nose, eyes, and mouth. For the fingerprint image, a minutiae algorithm is used. The proposed image recognition model recognizes an image, further analyzes each of the objects present in it to extract some high-level information, and compares the face and fingerprint images on the basis of the extracted features.
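
Viola-Jones detection is available in OpenCV as a Haar cascade; the following standard usage sketch (the image path is a placeholder) shows the face detection step, and analogous bundled cascades exist for eyes and smiles.

```python
# Detect faces with OpenCV's bundled Viola-Jones (Haar cascade) model.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("person.jpg")  # placeholder image path
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:  # one rectangle per detected face
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("person_faces.jpg", img)
```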

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Single-Sensor Hand Geometry and Palmprint Verification System


Several contributions have shown that fusion of decisions or scores obtained from various single-modal biometrics often enhances overall system performance. Biometric systems are becoming increasingly important, since they provide more reliable and efficient means of identity verification. The physical dimensions of a human hand contain information that is capable of authenticating the identity of an individual. The proposed system is a verification system which utilizes these hand geometry features for user authentication and uses a scanner as the sole sensor to obtain the hand images. The hand geometry verification system performs feature extraction to obtain the fingers and palm. The region of interest (ROI) is detected and cropped and acts as the base for palmprint feature extraction using Linear Discriminant Analysis (LDA). The matching scores are used in fusion algorithms, namely the sum rule, the weighted sum rule, and a Support Vector Machine (SVM); the results of fusing the palm and hand geometry classifiers are presented, including fusion using an SVM with a Radial Basis Function (RBF) kernel. The goal of a biometric verification system is to decide whether two characteristics belong to the same person or not.
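
A minimal sketch of the three score-fusion rules named in the abstract, on illustrative match scores normalized to [0, 1]; the scores and weights are assumptions, and the SVM treats each (hand, palm) score pair as a feature vector.

```python
# Score-level fusion: sum rule, weighted sum rule, and an RBF-kernel SVM.
import numpy as np
from sklearn.svm import SVC

hand = np.array([0.91, 0.15, 0.78, 0.22])  # hand-geometry matcher scores
palm = np.array([0.85, 0.30, 0.70, 0.10])  # palmprint (LDA) matcher scores
genuine = np.array([1, 0, 1, 0])           # 1 = same person, 0 = impostor

sum_rule = hand + palm              # simple sum rule
weighted = 0.4 * hand + 0.6 * palm  # weighted sum rule
print(sum_rule.round(2), weighted.round(2))

svm = SVC(kernel="rbf").fit(np.column_stack([hand, palm]), genuine)
print(svm.predict([[0.8, 0.75]]))   # fused accept/reject decision
```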

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Review on Lung Nodule Detection Using Patch-Based Context Analysis Method with Support Vector Machine


Early prediction of lung cancer is a very essential and crucial task in the healthcare industry. Lung cancer is a leading cause of death, so it is highly essential to detect and predict it at an initial stage. In the medical world, diagnostic imaging is an essential and important tool for early detection of diseases. Lung nodules, which lead to lung cancer, are predicted by image processing techniques in the medical field, which include Computed Tomography (CT) images. The novel classification method covers four types of lung nodules: well-circumscribed, vascularized, juxta-pleural, and pleural-tail.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey on Personalized Web Search Engine using Data Extraction from Multiple Web Databases


Web databases produce query result pages in response to a user's query. The goal of the proposed framework is to extract structured data, i.e., the data records contained in pages gathered from multiple web databases, and align them in one format, so that the user gets more relevant information. Automatically extracting the data from these query result pages is therefore important for applications such as data integration, which need to cooperate with multiple web databases. For this purpose, data extraction and alignment techniques are proposed. For extraction, CTVS, which combines both tag and value similarity, is used to extract the data from multiple web databases. For alignment, re-ranking methods are proposed that use semantic similarity to improve the quality of search results: fetch the top N results returned by the search engine, and use the semantic similarity between each candidate and the query to re-rank the results. First, the ranking position is converted into a relevance score for each candidate; then the semantic similarity score is combined with this initial relevance score to finally obtain the new ranking. Using the relevance score for each web page, the system determines the relevance of the data and arranges it in descending order of that score.
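
A minimal sketch of the re-ranking step, assuming the initial rank position is converted to a relevance score and combined with a semantic similarity to the query; the similarity values and the weight alpha are illustrative.

```python
# Combine rank-derived relevance with semantic similarity, then re-sort.
def rerank(candidates, semantic_sim, alpha=0.5):
    """candidates: list of ids in initial rank order (best first)."""
    n = len(candidates)
    combined = {}
    for rank, cid in enumerate(candidates):
        relevance = (n - rank) / n  # rank position -> score in (0, 1]
        combined[cid] = alpha * relevance + (1 - alpha) * semantic_sim[cid]
    return sorted(combined, key=combined.get, reverse=True)

top_n = ["page_a", "page_b", "page_c"]
sims = {"page_a": 0.2, "page_b": 0.9, "page_c": 0.6}
print(rerank(top_n, sims))  # ['page_b', 'page_c', 'page_a']
```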

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Sentiment Analysis Based Feedback Analysed Service Recommendation Method for Big Data Applications


There has been rapid growth of customers, online information, and services in recent years; thus personalized suggestions are required. Many traditional recommender systems are available, but they lack scalability, data security, and efficiency. Recommender systems are systems which have the power to analyse historical data and reviews and give individualized suggestions accordingly. Analysis of such a large amount of data, that is, "Big Data", is a major problem. Big Data is the term used when the amount of data exceeds the capability of current technology, methods, and theory to capture and process the data in the required time. This big data comes in the form of media logs and Twitter and Facebook logs; thus no schema is defined, and it cannot be analyzed using SQL. Cloud computing technology such as Hadoop is used to tackle this problem: Hadoop can analyse such data with no restriction on its type, be it structured, unstructured, or semi-structured. In this paper, feedback analysis is done using sentiment analysis to recommend services. Keywords are used to indicate what the users prefer, and more than two keywords can be used. Hadoop can analyse data more efficiently, and data security is also provided.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Light Fidelity (Li-Fi) – The Recent Trend in Wireless Network Communication


Light Emitting Diode (LED) technology is used in different areas of our everyday life. LEDs are widening their application areas, as they provide not only lighting capabilities but also data transmission. In this paper, we study the technology of Light Fidelity (Li-Fi), based on LEDs, and its applications in transferring data. Using Li-Fi, one can transfer data from one computer system to another. Li-Fi stands for Light Fidelity, proposed by the German physicist Harald Haas, who gave the solution of using the illumination of light for data transfer. Using this technology, it is easy to transmit data through illumination by sending it from an LED light bulb whose flicker is much faster than the human eye can follow. Li-Fi is more secure than the current Wi-Fi technology, as it uses visible light communication (VLC) rather than radio waves for the transmission of data, and it provides better efficiency, greater bandwidth, and higher speed. Li-Fi is implemented through VLC technology, which delivers high-speed data communication in a manner similar to Wi-Fi. As today's needs call for ever faster data transmission, Li-Fi tries to meet this requirement by offering a much faster data rate than Wi-Fi technology. In this paper, the authors present a detailed study of the Li-Fi technology, its advantages, and its limitations.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Review On Compressive Sensing


Wireless sensor networks consist of spatially distributed sensor nodes. These sensor nodes communicate with each other to transfer data from one node to another, and energy is consumed by the nodes during this transfer, so energy consumption in wireless sensor networks is high. To reduce this energy consumption, the method known as compressive sensing is used, and clustering is performed in the wireless sensor network for effective communication. The compressive sensing method reduces the energy consumed by the sensor nodes, so that the energy consumed by the wireless sensor network as a whole is lower.
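
A minimal compressive sensing demonstration: a k-sparse signal is recovered from far fewer random measurements than its length, which is what allows nodes to transmit less data. Orthogonal Matching Pursuit from scikit-learn stands in here for the (unspecified) recovery algorithm.

```python
# Recover a sparse signal from compressed random measurements.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5  # signal length, number of measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)  # k-sparse signal

Phi = rng.normal(size=(m, n)) / np.sqrt(m)  # random sensing matrix
y = Phi @ x                                 # m compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
print(np.allclose(omp.coef_, x, atol=1e-6))  # True: signal recovered
```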

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Exit Survey Analysis Using Opinion Mining


Opinion mining is the field of study that analyzes people's opinions, sentiments, evaluations, attitudes, and emotions from social media such as reviews, forum discussions, blogs, micro-blogs, Twitter, and social networks. This has led to the emerging fields of opinion mining and sentiment analysis, which deal with information retrieval and knowledge discovery from text using data mining. Opinion mining, also called sentiment analysis, involves building a system to collect and classify opinions about a product; it is a process for tracking the mood of the public about a certain product. Due to the cheap availability of the internet, people are more dependent on it: they purchase products online and give their opinions about them. A lot of research has been done in the area of opinion mining and sentiment analysis. In this paper we present an overview of opinion mining and sentiment analysis.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
REVIEW OF MULTICORE PROCESSING SYSTEM


The microprocessor has completely revolutionized the world we live in, and continuous efforts are being made to manufacture not only faster chips but also smarter ones. A number of techniques such as data-level parallelism, instruction-level parallelism, and hyper-threading already exist which have dramatically improved the performance of microprocessor cores. Advances in IC processing allow for more microprocessor design options. As the Chip Multiprocessor (CMP) system becomes the predominant topology for leading microprocessors, critical components of the system are now integrated on a single chip. This enables sharing of computation resources that was not previously possible. In addition, the virtualization of these computation resources exposes the system to a mix of diverse and competing workloads. On-chip cache memory is a resource of primary concern, as it can be dominant in controlling overall throughput. This paper presents an analysis of multicore architectures, covering variation in the number of cores, recovery, and performance-related studies.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
INTEGRATION OF SOUND SIGNATURE IN GRAPHICAL PASSWORD AUTHENTICATION SYSTEM


In the proposed work, a click-based graphical password scheme called Cued Click Points (CCP) will be implemented. A graphical password system with a supportive sound signature to increase the memorability of the password is discussed. In this system, a password consists of a sequence of images in which the user selects one click point per image. In addition, the user is asked to select a sound signature corresponding to each click point; this sound signature is used to help the user recall the click point on an image. The system is expected to give very good performance in terms of speed, accuracy, and ease of use. CCP is preferred to PassPoints, considering that selecting and remembering only one point per image is easier and the sound signature helps considerably in recalling the click points.
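
A minimal sketch of the CCP verification idea, assuming a stored sequence of one click point per image plus a chosen sound signature id; the tolerance radius is an assumption, and real systems store discretized or hashed points rather than raw coordinates.

```python
# Accept a login if every click falls within a tolerance of the stored point.
import math

TOLERANCE = 15  # pixels; illustrative value

stored = [  # (image_id, (x, y), sound_signature_id) per password step
    ("img1", (120, 88), "chime"),
    ("img2", (301, 210), "drum"),
]

def verify(attempt):
    """attempt: list of (image_id, (x, y)) clicks in order."""
    if len(attempt) != len(stored):
        return False
    for (img, pt, _sound), (img2, pt2) in zip(stored, attempt):
        if img != img2 or math.dist(pt, pt2) > TOLERANCE:
            return False
    return True

print(verify([("img1", (118, 90)), ("img2", (305, 214))]))  # True
```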

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
The System comes from VANET and Cloud


In Vehicular Ad Hoc Networks (VANETs), selfish nodes (vehicles) which do not contribute to the network are considered a threat, so vehicles are forced to participate in the network by sharing the information they collect on the road. In contrast, participation is optional for nodes in a vehicular cloud, and they are encouraged to participate with incentives, as the network utilizes vehicles' underutilized resources to provide services. Hence, it is necessary to have an integrated incentive system for the vehicular cloud, which combines the VANET and the cloud. The incentive system should be able to cover the entire vehicular cloud, including vehicles in the VANET and service providers in the cloud; in addition, the system must be robust against attacks. In this paper, we present an efficient scheme, a Secure Token Reward System in Vehicular Clouds. It ensures the scalability of the incentive system and provides a mechanism for securing incentive-related messages. Also, issued tokens can be used by vehicles to get services from the cloud.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Improving Access Efficiency of Small Files in HDFS


The Hadoop Distributed File System (HDFS) is designed to store very large data sets reliably and to stream those data sets at high bandwidth to user applications. It is designed to handle large files; hence, it suffers a performance penalty while handling a huge number of small files. Further, it does not consider the correlation between files to provide a prefetching mechanism that is useful for improving access efficiency. In this paper, we propose a novel approach to handle small files in HDFS. The proposed approach combines correlated files into one single file to reduce the metadata storage on the Namenode. We integrate prefetching and caching mechanisms into the proposed approach to improve the access efficiency of small files. Moreover, we analyze the performance of the proposed approach considering file sizes in the range 32 KB-4096 KB. The results show that the proposed approach reduces the metadata storage compared to HDFS.
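
A simplified, HDFS-independent illustration of the merging idea: many small files are packed into one container plus an index of (offset, length) pairs, so a name node would keep one entry instead of thousands; prefetching a whole container then serves correlated small files from a local cache.

```python
# Pack small files into one container blob with a lookup index.
def pack(small_files: dict) -> tuple[bytes, dict]:
    """small_files: name -> bytes. Returns (container, index)."""
    container, index, offset = b"", {}, 0
    for name, data in small_files.items():
        index[name] = (offset, len(data))  # where each file lives
        container += data
        offset += len(data)
    return container, index

def read(container: bytes, index: dict, name: str) -> bytes:
    offset, length = index[name]
    return container[offset:offset + length]

container, index = pack({"a.log": b"alpha", "b.log": b"bravo"})
print(read(container, index, "b.log"))  # b'bravo'
```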

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Angular JS


Science gateways provide user-centric views to cyber infrastructure resources, simplifying usage and enabling a richer user experience. To enable the goals of a science gateway and the communities of scientists it supports, gateway developers need to be able to spend more time on designing and developing the user experience and less time on wrestling with the underlying technology (such as HTML5, CSS, and JavaScript). In this seminar, we describe our experiences using Twitter Bootstrap and Angular JS frameworks to address this balance between design and implementation, empowering developers to create better styled and easily maintainable websites.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
IaaS Cloud Computing using Distributed Protocol


Cloud computing provides various economic benefits, but security issues arise while working with it. In this paper we propose a novel technique that will allow consumers to work in a cloud environment. First, we present a theoretical analysis of selected techniques and identify issues in IaaS cloud computing. Second, we propose a distributed trust protocol for IaaS cloud computing in order to make connections between users more secure. This distributed protocol makes users feel comfortable on a cloud computing platform. We follow the rule of dividing safety responsibility between the consumer and the provider, and we make the consumer the actual owner of the system. In our protocol, the specifications of the Trusted Computing Group (TCG) and the Trusted Platform Module (TPM) chip of the consumer are utilized. The protocol targets Infrastructure as a Service (IaaS).

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
CLOUD FORENSICS: INTRODUCTION AND CHALLENGES IN RESEARCH


Cloud computing is a new model which provides utility services over shared virtualized resources. It is envisioned that in the future, cloud computing can offer everything as a service (EaaS). A Cloud Service Provider (CSP) makes infrastructure, platform, and software services available over the Internet flexibly and at a lower cost. Cloud computing and digital forensics are both developing topics, and researching them requires an understanding of the main aspects of both. Cloud forensics is an approach that attempts to investigate and analyze cloud security threats. It ensures that attackers will be more cautious, to avoid prosecution for their illegal actions; it acts as a deterrent, reducing the network crime rate and improving security. The paper aims to provide better awareness of cloud forensics, explain some of the proposed frameworks, and identify the research gaps and challenges. The significance of this work is that it presents the state of the art in cloud forensics, which will be very useful for security practitioners and researchers.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Palladium Cryptography: Next Generation Security Computation Base


In today's world, when man is mostly dependent on gadgets for his work, security is a topic which cannot be ignored. In this paper we deal with a new cryptographic technique known as Palladium Cryptography, which involves changes to the basic hardware architecture to facilitate security. We look at the various architectural modifications and describe the hardware and software components in depth. We explain the working mechanism of Palladium, explore its various uses and its drawbacks, and end with a conclusion.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Heterogeneous Graph Propagation for Large-Scale Web Image Search


State-of-the-art web image search frameworks use the bag-of-visual-words (BoVW) model and the inverted index structure. In spite of being simple, efficient, and scalable, they often face problems of low precision and/or recall, because the local features are not stable and the information loss at the quantization stage is high. To improve the quality of the retrieved images, various post-processing methods have been applied after the initial search. In this paper, we consider the online querying process from a graph-based perspective. We introduce a graph model containing both image and feature nodes explicitly, and propose an efficient reranking approach consisting of two modules, i.e., incremental query expansion and image-feature voting. Unlike conventional reranking algorithms, the method does not require geometric information of the visual words, and hence enjoys low consumption of both time and memory. The method is also independent of the initial search process, and can work with many BoVW-based image search pipelines or be applied after other post-processing algorithms. We evaluate our approach on large-scale image search tasks and verify its competitive search performance.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Click Prediction For Web Image Reranking Using Multimodal Sparse Coding


Image reranking is an effective method for improving the performance of text-based image search. Existing reranking algorithms are limited for two main reasons: first, the textual data associated with images often does not match the actual visual content, and second, the extracted visual features do not accurately describe the semantic similarities between images. In recent years, user clicks have been shown to describe the relevance of retrieved images to search queries more precisely. However, the lack of click data is a critical problem for click-based methods, since users click on only a small number of web images. A solution to this problem is to predict image clicks. A multimodal hypergraph learning-based sparse coding method is proposed for image click prediction, and the predicted click data is applied to image reranking. A hypergraph is adopted to build a group of manifolds; a hyperedge in a hypergraph connects a set of vertices and preserves the constructed sparse codes. The weights of the different modalities and the sparse codes are obtained by an alternating optimization procedure. Finally, a voting strategy is used to predict, from the images corresponding to the sparse codes, whether an image is clicked or not. Additional image reranking experiments on real-world data show that the use of click prediction is beneficial to improving the performance of graph-based image reranking algorithms.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Analytical Survey On Big Data


Big data is the term for extremely large data sets that may be analyzed computationally to reveal patterns, especially relating to human behavior and interactions. This information can be utilized to obtain distinctive results and forecasts using diverse sorts of analysis. Data sets grow in size in part because they are increasingly being gathered by low-priced information-sensing mobile devices and aerial (remote sensing) platforms. Big data is hard to work with using most relational database systems and desktop statistics and visualization packages, requiring instead "massively parallel software running on tens, hundreds, or even thousands of servers". Big data usually involves data sets whose size is beyond the ability of commonly used software tools to capture, curate, manage, and process within a tolerable elapsed time. Big Data is also the name for the set of technologies and techniques that require new forms of integration to uncover large hidden values in datasets that are diverse, complex, and of massive scale. A big data environment is used to acquire, organize, and analyze various types of data. One observation about the MapReduce framework is that it generates a large amount of intermediate data within a particular time period and space; once the tasks finish, this intermediate data must be thrown away, because MapReduce is not able to reuse it.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Preventing Discrimination In Data Mining


Data mining technology is used for extracting the useful knowledge hidden in large collections of data; it is an increasingly important technology for this purpose. However, rules extracted from datasets by data mining techniques, such as classification or association rules, when used for decision tasks such as granting benefits, can be discriminatory. Discrimination refers to unfair or unequal treatment of people on the basis of their membership in a category or a minority, without regard to individual merit.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Fusion Based Multimodal Biometrics using Face and Palm print at Various Levels


Biometrics has wide applications in the fields of security and privacy. As unimodal biometrics is subject to various problems regarding recognition and security, multimodal biometrics is now used extensively for personal authentication. In recent years, researchers have paid enormous attention to designing effective multimodal biometric systems because of their ability to withstand spoof attacks. A single biometric sometimes fails to extract adequate information for verifying the identity of a person; on the other hand, by combining multiple modalities, enhanced performance and reliability can be achieved. In this paper, we fuse the face and palm print modalities at all levels of fusion, i.e., sensor level, feature level, score level, and decision level. For this purpose, we selected modality-specific feature extraction algorithms for face and palm print, namely LDA and LPQ respectively. The popular AR (face) and PolyU (palm print) databases were used for evaluation. Extensive experiments were conducted under both clean and noisy conditions to ascertain the most robust level of fusion and the impact of fusion strategies at the various levels for these two modalities. Results are substantiated with appropriate analysis.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Security Challenges and Issues in IoT


IoT (Internet of Things) is a concept in which various objects are connected to each other and have the ability to exchange data over a network. In the near future more than 40 million devices are going to be connected via the internet, and all these devices are going to interact among themselves and share information. If this information contains sensitive data, then security is an aspect which cannot be ignored. This paper focuses on the security challenges and issues related to IoT.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Analysis And Data Mining Techniques In Social Media


Social media has such a huge impact that the average internet user spends almost 2.5 hours daily liking and tweeting on social media, which has become a vast source of structured data. It has become necessary to extract value from large data sets to reveal associations, as well as to perform predictions of outcomes and behaviours. Big Data has been characterized by 4 Vs: Volume, Velocity, Variety, and Veracity. Unfortunately, these very large databases (VLDB) cannot be analysed within a reasonable amount of time and budget. Clustering is one such operation, grouping similar objects based on their connectivity, relative compactness, or some specific characteristics. Determining the correct number of clusters and computing the features are the main issues.
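
One common answer to the "correct number of clusters" issue is to sweep over k and keep the value with the best silhouette score; a minimal scikit-learn sketch on synthetic data follows (real input would be feature vectors derived from social media posts).

```python
# Choose k for k-means by maximizing the silhouette score.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)  # synthetic data

best_k, best_score = None, -1.0
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)  # higher = better-separated clusters
    if score > best_score:
        best_k, best_score = k, score

print(best_k)  # 4 for this synthetic data
```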

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Detection of Vampire Attack and Prevention for MANET


Mobile means able to move, and ad hoc means transient, without any fixed infrastructure; mobile ad hoc networks are thus a kind of transient network in which nodes can move without any fixed infrastructure or centralized administration. Mobile ad hoc networks (MANETs) represent complicated distributed systems comprising wireless mobile nodes that can freely and dynamically self-organize into random and transient network topologies.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
WebRTC Implementation and Architecture Analysis


Web browsers are becoming more powerful day by day, with increased usage of interactive multimedia applications. Web Real-Time Communications (WebRTC) is a standard to enable real-time multimedia features in web browsers without the need for third-party plugins. The media capabilities of the WebRTC standards are state of the art, with many new features, such as peer-to-peer interactive multimedia communication between web browsers. This paper analyzes the implementation and architecture of the WebRTC API along with its limitations. It also analyzes the Trapezoid and Triangle models of the WebRTC architecture.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey of India's Smart Cities Mission


The Smart Cities Mission is an innovative and recent initiative by the Government of India to develop smart cities pan-India to enable economic growth and improve the quality of life of people, by enabling local development and using smart technologies to make citizens' lives better. The Mission will cover 100 cities, and its duration of implementation will be five years (FY2015-16 to FY2019-20). The Mission may be continued thereafter on the basis of an evaluation to be done by the Ministry of Urban Development (MoUD), absorbing the learnings into the Mission. The Smart Cities Mission aims to set examples that can be replicated both within and outside the Smart City, speeding up the creation of similar smart cities in various regions and parts of the country. This paper offers insight into the mission's objective, the implementation guidelines, and the various challenges in the mission's implementation.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Analysis of Mobile Service Oriented Architecture for Different Applications


Today, mobile phones are widely used for the delivery of business and government services. A lightweight SOA framework is used to develop a wide range of business and government mobile applications. This paper analyzes mobile service-oriented architecture for different applications, and identifies the scope, recent trends, and future scope of SOA-based mobile services.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
DETECTING AND TRACKING DEFORMITIES IN MEDICAL IMAGES THROUGH WIENER FILTER BASED IMAGE ENHANCEMENT: A REVIEW


The processing of an image in order to make it more appropriate for certain applications is called image enhancement. It is a very critical component in obtaining a good segmentation, especially for X-ray images in medical applications. Sharpening the image and magnifying its contrast increase the accuracy of the subsequent modules of an autonomous disease diagnosis system. In the last few years, the quality of X-ray images has been enhanced by applying pre-processing, which is the first basic step in medical image processing, especially for segmentation and feature extraction. But even after applying such techniques, the results are not as accurate as required for medical applications. So, an efficient technique is required that improves the quality of a medical image by limiting the high noise characteristic of an X-ray image with the help of Histogram Equalization, followed by a Wiener filter, which will also support a better segmentation and extraction scheme for an X-ray image.
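
A minimal sketch of the two-stage pipeline described above, using OpenCV histogram equalization followed by SciPy's adaptive Wiener filter; the file name and window size are placeholders, not values from the paper.

```python
# Contrast stretching, then Wiener filtering, on a grayscale X-ray image.
import cv2
import numpy as np
from scipy.signal import wiener

xray = cv2.imread("chest_xray.png", cv2.IMREAD_GRAYSCALE)  # placeholder file

equalized = cv2.equalizeHist(xray)                       # histogram equalization
denoised = wiener(equalized.astype(np.float64), (5, 5))  # 5x5 Wiener window
denoised = np.clip(denoised, 0, 255).astype(np.uint8)

cv2.imwrite("chest_xray_enhanced.png", denoised)
```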

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Face Recognition using Principal Component Analysis


Image processing is the processing of images using mathematical operations, applying any form of signal processing for which the input is an image; the output of image processing may also be an image. Image processing usually refers to digital image processing, but optical and analog image processing are also possible. In computer graphics, images are manually made from physical models of objects, environments, and lighting, instead of being acquired from natural scenes, as in most animated movies. Computer vision, on the other hand, is often considered high-level image processing, out of which a machine, computer, or piece of software intends to decipher the physical contents of an image or a sequence of images.
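
For the PCA-based recognition named in the title, here is a minimal eigenfaces sketch with scikit-learn: faces are projected onto principal components and a probe is matched to the nearest training face in the reduced space; the Olivetti dataset and 50 components are illustrative choices, not the paper's setup.

```python
# Eigenfaces: PCA projection followed by nearest-neighbor matching.
import numpy as np
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

faces = fetch_olivetti_faces()  # 400 images, 64x64, flattened to 4096
X, y = faces.data, faces.target

pca = PCA(n_components=50, whiten=True).fit(X[:-1])
train = pca.transform(X[:-1])

probe = pca.transform(X[-1:])   # held-out image used as the probe
nearest = np.argmin(np.linalg.norm(train - probe, axis=1))
print(y[nearest] == y[-1])      # True if the identity matched
```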

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Review and Comparative Study of Dual Steganography Techniques for Embedding Text in Cover Images


In recent years, with the rapid growth of information technology and digital communication, it has become very important to secure information transmission between the sender and receiver. One of the best techniques for secure communication is steganography, a form of covert writing. The process of using steganography in conjunction with cryptography, called Dual Steganography, develops a sturdy model which adds a lot of challenges to identifying any hidden and encrypted data; encrypting any secret data before hiding it in the cover object provides double protection. In this paper we discuss three different dual steganography techniques for embedding text in cover images: image steganography combined with DES encryption pre-processing, dual-layer security of data using the LSB image steganography method with the AES encryption algorithm, and steganography inside steganography. The merits and drawbacks of each technique are presented, together with a comparative study to decide which technique is more useful in the future.
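
A minimal sketch of the dual idea: encrypt first, then hide the ciphertext bits in the least significant bits of a cover image. Fernet (an AES-based construction from the Python cryptography package) stands in for the AES step, and the random cover image is a placeholder.

```python
# Encrypt-then-hide: LSB steganography over an encrypted payload.
import numpy as np
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key).encrypt(b"meet at dawn")  # encrypt before hiding
bits = np.unpackbits(np.frombuffer(cipher, dtype=np.uint8))

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in cover
stego = cover.copy().ravel()
stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits  # overwrite LSBs
stego = stego.reshape(cover.shape)

# Extraction: read the LSBs back and decrypt.
recovered = np.packbits(stego.ravel()[:bits.size] & 1).tobytes()
print(Fernet(key).decrypt(recovered))  # b'meet at dawn'
```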

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
DENOISING OF LOW LIGHT VIDEO USING WAVELET FILTER: A REVIEW


Video quality enhancement is a long-standing area of research, and noise is a dominant factor that degrades image quality. Video denoising is a challenging topic in image processing; its main aim is to achieve efficient, adaptive, high-quality video. Extremely low-light video has a poor dynamic range even after noise reduction and enhancement; the dynamic range of the denoised video is increased by adjusting the RGB histogram. Several types of filters are used to remove noise from video; here, noise is removed using a non-local means (NLM) denoising filter. Usually, the success of any algorithm depends on the method used to improve the quality of the image.
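
A minimal sketch of the NLM step applied per frame with OpenCV's fast non-local means; the video file name and the filter strength h are placeholders.

```python
# Denoise each frame of a low-light video with fast non-local means.
import cv2

cap = cv2.VideoCapture("lowlight.avi")  # placeholder video file
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # h controls filter strength; larger h removes more noise (and detail).
    frames.append(cv2.fastNlMeansDenoisingColored(
        frame, None, h=10, hColor=10,
        templateWindowSize=7, searchWindowSize=21))
cap.release()
print(f"denoised {len(frames)} frames")
```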

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Secure Multi Cloud Data Hosting Scheme with High Availability Using SIC Architecture


Hosting data on the cloud reduces IT maintenance cost and enhances data reliability; it is because of this data reliability and low IT maintenance cost that cloud computing has become popular. In general, customers put their data on a single cloud. There are two drawbacks to putting data on a single cloud: the first is the vendor lock-in problem and the second is security in the cloud. The solution to this problem is putting data on different cloud servers without redundancy, using an encryption algorithm. Cloud computing provides easy data retrieval, but customers do not want to lose their sensitive data in the cloud, and the issue of data theft must also be overcome to provide better service. A multi-cloud environment has the ability to reduce the security risks, and we provide a framework to avoid them.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
The Selection of Programming Language to reduce Defect and increase Quality


There are many different programming languages used for the development of various applications, and defects are associated with programming languages. The selection of a programming language depends on the user's goal, so variables like defects, stages, types of languages, and their behavior are studied to find relationships between them. Every language has advantages and disadvantages. It was found that programming languages are associated with the number of defects and hence with quality and productivity.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Review on Implementation Technique for Preserving Privacy in Big Data Mining


The term big data usually describes the exponential growth of data in every sector of life, for both structured and unstructured data. To make use of this large amount of data, data owners use different data mining techniques to extract knowledge, but at the same time they have to compromise the privacy of their clients [4]. For this purpose, in this paper we define Privacy Preserving Big Data Mining, in which we describe techniques for extracting useful information from large sets of big data and define security techniques to protect it from access by unauthorized users. A number of methods and techniques have been developed for privacy-preserving data mining. In this paper, we try to provide a complete review of PPDM for big data and of different techniques, such as data partitioning and distributed storage using the Hadoop MapReduce technique with the HDFS distributed file system, which can effectively be used to prevent data access by unauthorized users. This approach has become increasingly popular because it allows the sharing of sensitive data for analysis purposes. Several data mining algorithms incorporating privacy-preserving mechanisms have been developed that allow one to extract relevant knowledge from a large amount of data while hiding sensitive information from disclosure or inference.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Preserving confidentiality of user’s data in Cloud Computing: A Review


Cloud computing services are among the fastest-growing segments of the IT sector due to features such as attractive cost savings for consumers, convenience and reliability options for consumers, and scalable sales for providers, particularly as recession-hit institutions have become increasingly cost-aware. The relentless and increasing use of such cloud-based computing solutions is shifting the way IT professionals identify, mitigate, and communicate the present and potential future risks of the various cloud models. Cloud computing is a double-edged sword from the privacy and security perspective: despite its potential to provide security at low cost, businesses may increase their risk by storing sensitive data in the cloud. This paper describes a security architecture and discusses how diverse types of attacks are countered by the proposed architecture; without it, the data stored in the cloud does not have protection.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Secure access of data in Intranet using Web Services


The Internet is a wide area network that connects computers, servers, and other networking devices. An intranet is used by organizations or companies and is protected by a firewall, which prevents hacker intrusions. Intranets permit users to undertake the same tasks that can be done on the Web, such as posting documents, sending e-mail, and chatting with other users. In addition, intranets are used as vehicles for keeping all workers up to date on institute or company policies and information. The Internet is a network of networks that is owned by no one and everyone. The key to understanding intranet safety involves recognizing the important differences between intranets and the Internet and the several cooperation options that virtual networks offer. The specific security threats of intranets can be found in communications, software, data, and operations security.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Review of Domain Evaluation of Trustworthiness in Online Social Networks


Online Social Networks (OSNs) are a fertile medium through which users can unleash their opinions and share their thoughts, activities, and knowledge of various topics and domains. This medium allows legitimate users as well as spammers to publish their content, leveraging the open environment and the fewer restrictions associated with OSNs. Hence, it is essential to evaluate users' credibility in various domains and accordingly make judgements about potentially influential users in particular domains. Most of the existing approaches to evaluating the trustworthiness of users and their posts in OSNs are generic; there is a lack of domain-based trustworthiness evaluation mechanisms. In OSNs, discovering users' influence in a specific domain has been motivated by its significance in a broad range of applications such as personalized recommendation systems and expertise retrieval. The aim of this paper is to present a preliminary approach to evaluating domain-based user trustworthiness in OSNs. We provide a novel distinguishing measurement for users in a set of knowledge domains. Domains are extracted from the user's content using semantic analysis. In order to obtain the level of trustworthiness, a metric incorporating a number of attributes extracted from content analysis and user analysis is consolidated and formulated, considering a temporal factor. The approach presented in this paper is promising, since it provides a fine-grained trustworthiness analysis of users and their domains of interest in OSNs.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Review of Trusted Cloud Computing Platform Security


Cloud computing infrastructures enable companies to cut costs by outsourcing computations on demand. Security and privacy are two prime barriers to the adoption of cloud computing. The Distributed Trusted Computing Platform (DTCP) model can improve cloud computing security without bringing much complexity to users. To address this problem at the Infrastructure-as-a-Service level, a trusted cloud computing platform model has been proposed to provide a closed-box execution environment that guarantees confidential execution of guest virtual machines. In this paper, we deal with different infrastructure-level attacks and, through the use of a trusted cloud computing platform, provide a distributed solution to implement it.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Quality based service composition using security token negotiation


The rise of Services Computing opens a large space of opportunities to compose "situational" web applications from web-delivered services. However, the large number of services and the complexity of composition constraints make manual composition difficult for application developers, who may be non-professional programmers or even end-users.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Web Application for Sentiment Analysis Using Machine Learning


Sentiment analysis is a branch of natural language processing. It deals with text classification in order to determine the intention of the author of the text; the intention can be admiration (positive) or criticism (negative). This paper presents a comparison of results obtained by applying the Naive Bayes (NB) and Support Vector Machine (SVM) classification algorithms. These algorithms are used to classify a sentimental review as having either a positive or a negative polarity. Sentiment analysis is now a focus for companies wishing to extract information from customer reviews; usually the classification is positive, negative, or neutral. In this research we focus on reviews of electronic products. The demographics and technical expertise of consumers and reviewers are diverse, hence there are differences in the way they review a product: some reviews contain technical solutions, improvised ways of tackling problems, frustration, joy, etc. This variety calls for a wider classification scheme to capture these differences. We thereby introduce a five-class scheme, namely positive, negative, advice, no sentiment, and neutral, at the sentence level. We crawled data from amazon.com and used open-source natural language processing tools to extract the sentiment from the reviews.
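
A minimal sketch of the NB vs. SVM comparison with scikit-learn; the four-review toy corpus is a stand-in for the crawled amazon.com reviews, and TF-IDF is one common choice of text features.

```python
# Train and compare Naive Bayes and a linear SVM on review text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

reviews = ["battery life is excellent", "screen cracked within a week",
           "sound quality is amazing", "stopped working, total waste"]
labels = ["positive", "negative", "positive", "negative"]

for clf in (MultinomialNB(), LinearSVC()):
    model = make_pipeline(TfidfVectorizer(), clf).fit(reviews, labels)
    print(type(clf).__name__, model.predict(["great value, works perfectly"]))
```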

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Improvement in Accuracy of Software Estimation using T3 an Improved Decision Tree Classifier


Software estimation data from the past history of projects are valuable resources from which accurate and potentially useful knowledge can be discovered using data mining. Data mining can assist and support software engineers in accurate project cost estimation and investigative research.
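
T3 itself is not publicly packaged; as a generic stand-in, the following sketch trains a standard decision tree classifier to predict an effort band from made-up project metrics.

```python
# Hypothetical effort-band prediction with a standard decision tree.
from sklearn.tree import DecisionTreeClassifier

# columns: [size_kloc, team_size, complexity (1-3)]; data is illustrative
X = [[10, 4, 1], [120, 20, 3], [35, 8, 2], [250, 30, 3], [15, 5, 1]]
y = ["low", "high", "medium", "high", "low"]  # effort/cost band

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(tree.predict([[60, 10, 2]]))  # predicted effort band for a new project
```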

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Knowledge Sharing in Collaborative Environment


In collaborative environments, members may try to obtain similar information on the Web in order to gain knowledge in a domain. For instance, in a company several departments may consecutively need to buy business intelligence software, and employees from these departments might have separately studied several business intelligence tools and their features online. It would be productive to get them connected and let them share the learned knowledge. Many organizations have collected and stored vast amounts of data, but they are not able to discover the valuable information concealed in the data by transforming it into useful knowledge. Knowledge-sharing activities can improve access to information, ease communication with colleagues, and encourage participation in learning and decision-making communities.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Security Issues in Data Mitigation and Mining: An Overview


The improving trends in information technology have enabled the collection and processing of vast amounts of personal information, such as shopping habits, criminal records, credit and medical history, and driving records. This data is undoubtedly very useful in many areas, including medical research, national security, and law enforcement. Privacy is commonly seen as the right of each user to control information about themselves; however, there is increasing public concern about individuals' privacy. In this paper, we view the privacy issues related to data mining from a wider perspective and investigate various remedies that can help to protect sensitive data. In particular, we identify four different types of user roles involved in data mining applications, namely the data provider, data collector, data miner, and decision maker. For each kind of user, we focus on their privacy issues and on techniques to protect sensitive knowledge.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Review on Aspect Detection in Social Networking Sites for Graphical Data


Aspect detection is now gaining interest due to the rapid growth of social networks. The conventional term-frequency approach may not work well in this context, as the information exchanged on social sites contains not only text but also images, videos, and URLs. We focus on the emergence of topics signaled by the social aspect of these networks, especially on replies and mentions of users, i.e., links between users that are generated dynamically through replies, mentions, and retweets. We propose a probability model of the mentioning behavior of social network users and propose to detect new topics from anomalies measured through the model. Finally, by aggregating anomaly scores from a number of users, we show that new topics can be found based on the reply and mention relationships in the social network.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------