Medha 2013 - National Level Conference on Recent Trends


Google Glass[ ]


Google has developed a wearable computer with an optical head-mounted display (OHMD) under the research and development project Project Glass, with the intention of producing a mass-market ubiquitous computer. Glass displays information in a hands-free format and can interact with the Internet through natural-language voice commands. Google Glass combines features of virtual reality and augmented reality. It runs on Google’s Android operating system and draws on other technologies such as 4G, EyeTap, smart clothing and the smart grid. Google Glass is one of the most futuristic gadgets seen in recent times, and it promises to be a useful technology for all kinds of people, including the handicapped/disabled.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Performance Analysis of Routing Protocols for Mobile Ad-hoc Networks[ ]


This study compares three routing protocols proposed for wireless mobile ad-hoc networks: Destination Sequenced Distance Vector (DSDV), Ad-hoc On-demand Distance Vector (AODV) and Dynamic Source Routing (DSR). Extensive simulations are made on a scenario in which nodes move randomly. Results are presented as a function of a novel mobility metric designed to reflect the relative speeds of the nodes in a scenario. Furthermore, three realistic scenarios are introduced to test the protocols in more specialized contexts. In most simulations the reactive protocols (AODV and DSR) performed significantly better than DSDV. At moderate traffic load DSR performed better than AODV for all tested mobility values, while AODV performed better than DSR at higher traffic loads. Mobile ad-hoc networks have been the focus of many recent research and development efforts; combinations of wide-range and short-range ad-hoc networks seek to provide robust, global coverage, even during adverse operating conditions.
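
As an illustration of the kind of mobility metric mentioned above, the sketch below computes the mean pairwise relative speed of the nodes in a scenario from their velocity vectors. It is one plausible formulation for illustration only; the exact metric used in the compared study is not reproduced here.

    # Hypothetical sketch: quantify scenario mobility as the mean relative
    # speed over all node pairs, given each node's 2-D velocity vector.
    import itertools
    import math

    def relative_speed(v1, v2):
        """Magnitude of the difference of two 2-D velocity vectors (m/s)."""
        return math.hypot(v1[0] - v2[0], v1[1] - v2[1])

    def mobility_metric(velocities):
        """Average pairwise relative speed across all nodes in the scenario."""
        pairs = list(itertools.combinations(velocities, 2))
        if not pairs:
            return 0.0
        return sum(relative_speed(a, b) for a, b in pairs) / len(pairs)

    # Example: three nodes moving with different velocity vectors.
    print(mobility_metric([(1.0, 0.0), (0.0, 2.0), (-1.5, 0.5)]))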

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Analysis Of Three-Dimensional Password Scheme[ ]


In this paper, we analyze and evaluate our contribution, a new authentication scheme based on a virtual three-dimensional environment. Users navigate through the virtual environment and interact with items inside it. The combination of all interactions, inputs and actions performed on the items within the virtual three-dimensional environment constructs the user’s 3-D password. A 3-D password can combine most existing authentication schemes, such as biometrics, textual passwords and graphical passwords, into one virtual three-dimensional environment. The most important application of the 3-D password is the protection of critical resources and systems.
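
A minimal sketch of the idea described above follows: the ordered sequence of interactions performed inside the virtual environment is serialized and hashed into a single verifier. Object names, action types and values are purely illustrative, not the scheme's actual encoding.

    # Illustrative sketch (not the paper's implementation): derive a 3-D
    # password from the ordered sequence of actions a user performs inside
    # the virtual environment, e.g. walking to a spot, typing on a virtual
    # keyboard, or clicking an object.
    import hashlib
    import json

    def derive_3d_password(actions):
        """actions: ordered list of (object_id, action, value) tuples."""
        canonical = json.dumps(actions).encode("utf-8")
        return hashlib.sha256(canonical).hexdigest()

    enrolled = derive_3d_password([
        ("room1.door", "open", None),
        ("desk.keyboard", "type", "s3cret"),
        ("wall.painting", "click", [12, 40]),   # interaction point in the scene
    ])

    attempt = derive_3d_password([
        ("room1.door", "open", None),
        ("desk.keyboard", "type", "s3cret"),
        ("wall.painting", "click", [12, 40]),
    ])

    print("authenticated" if attempt == enrolled else "rejected")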

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Wireless Network Security[ ]


The pervasive availability and wide usage of wireless networks with different kinds of topologies, techniques and protocol suites have brought with them a need to improve security mechanisms. The design, development and evaluation of security techniques must begin with a thorough analysis of the requirements and a deeper understanding of the approaches that are practical within the system constraints. In this paper, we investigate the recent advances in wireless security from theoretical foundations to evaluation techniques, from network level management to end user trust inference and from individual protocol to hybrid systems. We identify the open security issues associated with trust, management, interoperation and measurement. These problems, whose solutions are different in nature and scale from their companions in wired networks, must be properly addressed to establish confidence in the security of wireless networking environments.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey On Secure Authentication With 4-d Password In Cloud Computing[ ]


Cloud computing is an emerging frontier in the field of computer technology in which data and services reside in massively scalable data centers in the cloud and can be accessed from any connected device over the internet. The advantages offered by cloud computing make it one of today’s most exciting technologies: lower computing cost, increased flexibility and scalability of computing processes, improved performance, greater computing power, practically unlimited storage capacity, and so on. Cloud computing is nowadays one of the fastest growing parts of the IT industry, but there are serious concerns about its security. It is certainly possible for a cloud system to be hacked and for cloud-based documents to be accessed by unauthorized users. To ensure that cloud services are accessed only by authorized users, we need a more secure authentication technique. In cloud computing, authentication can be done in many ways, for example 3-D password authentication or biometrics-based authentication. 3-D password authentication supports multifactor authentication: it combines existing authentication schemes into a single 3-D virtual environment. The 3-D virtual environment contains several objects or items with which the user has to interact, and the interaction varies from one item to another. Attackers from different backgrounds may still break the system, so we strengthen it by adding a fourth dimension to the password. This paper presents a study of the 3-D password and an approach to strengthen it with a 4-D password that incorporates gesture recognition and time recording, which helps strengthen authentication. Hence, we build a 4-D password on top of 3-D password authentication.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey on Identification topology through custom personal touch[ ]


Today’s identification and authentication mechanisms for touchscreen-enabled devices are cumbersome and do not support brief usage and device sharing. Touch-based personal tokens would let devices unobtrusively identify who is interacting with the device at any given time. Devices could then tailor services to users and control access to sensitive information and online services. The use of devices such as smartphones and tablets for storing sensitive information and accessing online services is increasing. At the same time, methods for authenticating users to their devices and online services that are not only secure but also privacy-preserving and user-friendly are needed. This seminar presents an approach for using a wearable personal token, in the form of a ring, to send an identification code to devices through touch. In conjunction with touchscreen receivers, this ring serves as a user identification token that would be compatible with a myriad of applications, such as user authentication for login, gaming, parental control, near-field communication (NFC) applications, etc. The ring also serves as a single token or 'key' that provides unified access to multiple devices belonging to the user.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Review of Li-Fi Technology: New Future Technology- Light Bulb to Access the Internet[ ]


Nowadays, with the advent of technology, communication has become the backbone of Information and Communication Technology (ICT), and ICT has made our globe like a town. Today everyone, whether businesses, institutions or organizations, wants the right information at the right time, which needs fast internet connectivity. This paper reflects on the future of communication, Li-Fi, which may affect all our lives. Li-Fi is a technology that may reach speeds of approximately 500 MBps (about 30 GB per minute) and is an alternative that is cost effective, more robust and more useful than Wi-Fi technology. It is a fast and low-cost wireless communication system that is the optical version of Wi-Fi, and visible light communication may be the future of the Internet. Harald Haas has come up with a solution he calls "data through illumination": taking the fiber out of fiber optics by sending data through LED lights that vary in intensity faster than the human eye can follow. The future may be one where data for personal computers, laptops, smartphones and tablets is transmitted through the light in a room, and security would be a snap: if you cannot see the light, you cannot access the data.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey on State of the art and future developments of measurement applications on smartphones[ ]


Modern smartphones contain several sensor technologies, so they can be used as stand-alone measurement instruments in a wide range of application domains. This paper surveys measurement applications based on smartphones. The first part covers the evolution of mobile phone technologies, including developments in mobile networks and sensors. Then, in order to highlight the sensing and communication capabilities, an architectural overview is given of the hardware and software technologies available on the latest series of smartphones. The final part presents the integration of augmented reality into measurement applications and new types of measurement systems that use a smartphone as the processing support.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Privacy-aware traffic monitoring In Cloud Computing[ ]


Cloud security has become one of the major barriers to widespread adoption of conventional cloud services, and we anticipate that the same problems will be present in vehicular clouds (VCs). In a VC, underutilized vehicular resources, including computing power, storage and Internet connectivity, can be shared between drivers or rented out over the Internet to various customers. Clearly, if the VC concept is to see wide adoption and to have significant societal impact, security and privacy issues need to be addressed. The main contribution of this work is to identify and analyze a number of security challenges and potential privacy threats in VCs. Although security issues have received attention in cloud computing and vehicular networks, we identify security challenges that are specific to VCs, e.g., challenges of authentication of high-mobility vehicles, scalability and single interface, tangled identities and locations, and the complexity of establishing trust relationships among multiple players caused by intermittent short-range communications. Additionally, we provide a security scheme that addresses several of the challenges discussed. Vehicles often communicate through multihop routing, and a request-response involves multiple participants, including users, infrastructure, servers, platform, application, key generator and privacy agent.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Security Model for Wireless Computer Network[ ]


To implement a security model for a wireless computer network we require an effective wireless intranet setup, and many models are already in operation. This work is focused on developing a security model to secure the wireless computer network of any institution. The model secures a wireless computer classroom through an authentication server by capturing authentication constraints at registration time, which are stored and used for comparison at login. A fingerprint is used to make sure that a user is who he or she claims to be. A time duration for access is allotted to each user, after which the primary constraint must be supplied again for re-authentication. While a user is still logged on, security questions are posed intermittently to avoid counterfeiting. The methodology used for this research is Structured System Analysis and Design (SSAD). The program will be coded in the Java programming language with MySQL as the database. The final result is a secure model that guarantees secure access. This differs from the security of other wireless virtual classrooms, which use only user name, PIN or registration number.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Study of Face Detection for Wireless Secure Data Transmission[ ]


Security of data is a main goal of today’s communication. This paper presents an idea for an efficient system that provides sound security for data transfer from one user to another using Radio Frequency Identification (RFID) technology along with a face detection technique. This two-way security approach assures both parties, sender and receiver, of the confidentiality of their data while maintaining a flexible communication approach.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Phishing Susceptibility and Anti-Phishing Security Strategies-Literature Review[ ]


Phishing is a con game that scammers use to collect personal information from unsuspecting users. The false e-mails often look surprisingly legitimate, and even the Web pages where users are asked to enter their information may look real. Phishing is similar to fishing in a lake, but instead of trying to capture fish, phishers attempt to steal personal information. This paper gives brief information about phishing, its attacks, and steps that users can take to safeguard their confidential information. There has been an alarming increase in both the volume and the sophistication of phishing attacks, and this paper also presents a survey conducted by Netcraft on phishing. Phishing is a form of online identity theft that aims to steal sensitive information such as online banking passwords and credit card information from users. The paper examines a large-scale dataset collected from real phishing cases and studies different approaches for handling phishing activities. Anti-Phish aims to protect users against spoofed web-site-based phishing attacks. To this end, Anti-Phish tracks the sensitive information of a user and generates warnings whenever the user attempts to give away this information to a web site that is considered untrusted. Anti-Phish is based on the premise that for inexperienced, technically unsophisticated users, it is better for an application to attempt to check the trustworthiness of a web site on behalf of the user. Unlike a user, an application will not be fooled by obfuscation tricks such as a similar-sounding domain name. The paper reviews various methods to confront these challenges, such as textual and visual content-based anti-phishing (a Bayesian approach) and image-based page matching and page classification with anti-phishing strategies.
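
The Anti-Phish behaviour described above, tracking where sensitive values were first used and warning when the same value is about to leave for a different domain, can be sketched as follows. The class, method names and storage format are illustrative only, not the tool's actual code.

    # Minimal sketch: remember on which domains a sensitive value (e.g. a
    # banking password) was originally entered, and warn if the same value
    # is about to be submitted to a different, untrusted domain.
    import hashlib

    class SensitiveDataWatcher:
        def __init__(self):
            self._seen = {}  # hash of value -> set of domains it was entered on

        def _h(self, value):
            return hashlib.sha256(value.encode("utf-8")).hexdigest()

        def record(self, value, domain):
            self._seen.setdefault(self._h(value), set()).add(domain)

        def check_submission(self, value, domain):
            trusted = self._seen.get(self._h(value), set())
            if trusted and domain not in trusted:
                return f"WARNING: value previously sent only to {sorted(trusted)}"
            return "ok"

    w = SensitiveDataWatcher()
    w.record("myBankPassword", "onlinebank.example")
    print(w.check_submission("myBankPassword", "onlinebank.example"))        # ok
    print(w.check_submission("myBankPassword", "onl1nebank-login.example"))  # warning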

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Securing Underwater Wireless Communication Networks-Literature Review[ ]


Underwater wireless communication networks (UWCNs) are particularly vulnerable to malicious attacks due to the high bit error rates, large and variable propagation delays, and low bandwidth of acoustic channels. The unique characteristics of the underwater acoustic communication channel and the differences between underwater sensor networks and their ground-based counterparts require the development of efficient and reliable security mechanisms. In this paper a complete survey of security for UWCNs is presented, and the research challenges for secure communication in this environment are outlined.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Big Data - an opportunity and challenge for E-commerce[ ]


In today’s world the internet is one of the necessities of human life, and almost all systems are now online. E-commerce is one of today’s hot topics, and as the size of e-commerce trade increases, so do the opportunities and problems associated with it. One of the major problems associated with e-commerce is the data that is produced and needs to be processed for effective business intelligence. This paper discusses Big Data as an opportunity and a problem for e-commerce.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
An Era Of Mobile Computing In Networks[ ]


This paper outlines and sheds some light on mobile computing in networks, and presents advances in the area of mobile computing networks, such as wireless networking. Advances in wireless networking have prompted a new concept of computing, called mobile computing, in which users carrying portable devices have access to a shared infrastructure independent of their physical location. This provides flexible communication between people and (ideally) continuous access to networked services. Mobile computing is a term used to describe technologies that enable people to access network services anyplace, anytime, and anywhere. Mobile computing is a powerful revolution in the way computers are used, and in the coming years this will become even more perceptible, although many of the devices themselves will become smaller or even invisible to users.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Survey On Content Based Image Retrieval System[ ]


This paper presents a novel framework for combining three features: color, texture and shape information. It also achieves higher retrieval efficiency using a dominant color feature. The image and its complement are divided into non-overlapping tiles of the same size. The features drawn from co-occurrence conditional histograms between the image tiles and the corresponding complement tiles, in RGB color space, act as local descriptors of color, shape and texture. The paper applies the integration of the above combination and then clusters images based on similar properties, retrieving similar images based on dominant colors. Shape information is captured in terms of edge images computed using Gradient Vector Flow fields, and invariant moments are then used to store the shape features.
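
One ingredient mentioned above, the dominant color of an image tile, can be illustrated with the short baseline below, which quantizes RGB values and takes the most frequent bin. This is an illustrative stand-in, not the paper's exact descriptor.

    # Sketch: dominant color of a tile via coarse RGB quantization.
    import numpy as np

    def dominant_color(tile, levels=4):
        """tile: HxWx3 uint8 array. Returns the most frequent quantized RGB value."""
        q = tile.astype(np.uint16) * levels // 256            # quantize each channel
        codes = q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]
        counts = np.bincount(codes.ravel(), minlength=levels ** 3)
        code = int(np.argmax(counts))
        r, rem = divmod(code, levels * levels)
        g, b = divmod(rem, levels)
        scale = 256 // levels
        return (r * scale, g * scale, b * scale)

    # Example on a synthetic 8x8 tile that is mostly red.
    tile = np.zeros((8, 8, 3), dtype=np.uint8)
    tile[..., 0] = 200
    tile[0, 0] = (0, 0, 255)
    print(dominant_color(tile))   # a red bin dominates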

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Data Security in Unreliable Cloud Using Access Control and Access Time [ ]


Cloud computing has been envisioned as the next-generation architecture of IT activity. In contrast to established solutions, where IT services are under appropriate physical, logical and personnel controls, cloud computing moves application software and databases to large data centers, where the management of the data and services may not be fully reliable. This unique attribute, however, poses many new security challenges which have not been well understood. In this project we focus on cloud data storage security, which has always been an important aspect of quality of service. The data owner stores encrypted data in the cloud to ensure security for his data in the cloud computing environment and issues decryption keys only to authorized users to access the data from the cloud. When a user is revoked, the data owner has to re-encrypt the data so that the revoked user cannot access it again. To perform this operation the data owner issues a re-encryption command to the cloud so that the data in the cloud gets re-encrypted. Once re-encryption is done there is a need to generate new decryption keys for legitimate users, so that they can continue to access the data. Within a cloud computing environment such commands may not be received and executed by all of the cloud servers due to unreliable network communications. To solve this problem we propose a time-based re-encryption scheme, in which automatic re-encryption of data takes place based on the internal clock value present at the cloud server. To execute this automatic re-encryption we make use of an encryption technique called Attribute Based Encryption (ABE) together with DES (Data Encryption Standard). ABE provides fine-grained access control and an easier user revocation system, and DES provides the encryption.
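
The clock-driven re-encryption idea above can be sketched as follows. Fernet from the cryptography package is used here only as a stand-in symmetric cipher; the ABE-with-DES construction proposed in the paper is not reproduced.

    # Hedged sketch: the server consults its own clock and, once the interval
    # has elapsed, re-encrypts the stored ciphertext under a fresh key so that
    # keys held by revoked users stop working.
    import time
    from cryptography.fernet import Fernet   # pip install cryptography

    class TimedReencryptingStore:
        def __init__(self, plaintext: bytes, interval_s: float):
            self.interval_s = interval_s
            self.key = Fernet.generate_key()
            self.ciphertext = Fernet(self.key).encrypt(plaintext)
            self.last_rotation = time.monotonic()

        def maybe_reencrypt(self):
            """Called periodically by the server, driven by its internal clock."""
            if time.monotonic() - self.last_rotation < self.interval_s:
                return False
            data = Fernet(self.key).decrypt(self.ciphertext)
            self.key = Fernet.generate_key()        # new decryption key for authorized users
            self.ciphertext = Fernet(self.key).encrypt(data)
            self.last_rotation = time.monotonic()
            return True

    store = TimedReencryptingStore(b"owner data", interval_s=0.1)
    time.sleep(0.2)
    print("rotated:", store.maybe_reencrypt())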

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
The Secure Dynamic Source Routing Protocol in MANET to authenticate the node[ ]


The Dynamic Source Routing protocol (DSR) is a simple and efficient routing protocol designed specifically for use in multi-hop wireless ad hoc networks of mobile nodes. DSR allows the network to be completely self-organizing and self-configuring, without the need for any existing network infrastructure or administration. The protocol is composed of the two mechanisms of Route Discovery and Route Maintenance, which work together to allow nodes to discover and maintain source routes to arbitrary destinations in the ad hoc network. The use of source routing allows packet routing to be trivially loop-free, avoids the need for up-to-date routing information in the intermediate nodes through which packets are forwarded, and allows nodes forwarding or overhearing packets to cache the routing information in them for their own future use. All aspects of the protocol operate entirely on-demand, allowing the routing packet overhead of DSR to scale automatically to only that needed to react to changes in the routes currently in use.
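
The Route Discovery mechanism summarized above can be illustrated with a small flood simulation on a static topology: each node appends itself to the route record until the target is reached, and the accumulated source route is returned. Real DSR adds request identifiers, route caching and Route Maintenance, which are omitted in this sketch.

    # Illustrative sketch of DSR-style Route Discovery (flooding with an
    # accumulated route record), not the full protocol.
    from collections import deque

    def route_discovery(adjacency, source, target):
        """adjacency: dict node -> iterable of neighbour nodes."""
        queue = deque([[source]])            # each entry is a partial route record
        visited = {source}
        while queue:
            route = queue.popleft()
            last = route[-1]
            if last == target:
                return route                 # source route carried back in the route reply
            for neighbour in adjacency.get(last, ()):
                if neighbour not in visited:
                    visited.add(neighbour)
                    queue.append(route + [neighbour])
        return None                          # no route currently exists

    topology = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": []}
    print(route_discovery(topology, "A", "E"))   # e.g. ['A', 'B', 'D', 'E']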

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Comparative Study of Bar code & Quick Response Code & its Security Issues [ ]


In this paper, the main discussion is a comparative study of the bar code technique and the QR code (Quick Response code) technique for the proper monitoring and transfer of data. Bar codes have low reading speed and accuracy. As bar codes became popular and their convenience universally recognized, the market began to call for codes capable of storing more information and more character types that could be printed in a smaller space. As a result, various efforts were made to increase the amount of information stored by bar codes, such as increasing the number of bar code digits or laying out multiple bar codes. However, these improvements also caused problems such as enlarging the bar code area, complicating reading operations, and increasing printing cost. QR code recognition using a mobile device is an efficient technology for data transfer. To solve the problem of low reading capacity and accuracy, the QR code was developed: it has high encoding capacity with high reading speed and can be read from any direction. QR code stands for Quick Response code, a well-known two-dimensional bar code widely used in industry because of its high accuracy and reading speed.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2nd Generation and 3rd Generation mobile Services[ ]


3G services represent an evolution in the telecommunication industry. However, 3G services have not received the adoption rate expected despite the various benefits provided by the service. This study aims to investigate the factors affecting the intention to adopt 3G services among university students in Malaysia, since they are expected to be the group with the greatest potential to adopt 3G services. Diffusion of innovation theory is modified and applied in this study to achieve the objective. Results of this study show that perceived compatibility, perceived relative advantage, perceived result demonstrability, perceived trialability, perceived image, and perceived enjoyment are significantly associated with the intention to adopt 3G services. Surprisingly, the perceived cost of using 3G services is found to be positively but insignificantly associated with the intention to adopt 3G services. Managerial implications and conclusions are also discussed. All mobile operators with both 2G and 3G licenses require maximum reuse of the existing infrastructure when building the 3G network. Because of the importance of a seamless mobile network, Ericsson has been heavily involved in defining the interfaces and new protocols and in creating a solution for integration of the 2G and 3G mobile systems. Ericsson, as a leading company in the telecommunication field, is active in all standardization forums, including ITU, 3GPP, etc.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
3G & 4G Service Architectures[ ]


3G services represent an evolution in the telecommunication industry. However, 3G services have not received the adoption rate expected despite the various benefits provided by the service. This study aims to investigate the factors affecting the intention to adopt 3G services among university students in Malaysia, since they are expected to be the group with the greatest potential to adopt 3G services. Diffusion of innovation theory is modified and applied in this study to achieve the objective. Results of this study show that perceived compatibility, perceived relative advantage, perceived result demonstrability, perceived trialability, perceived image, and perceived enjoyment are significantly associated with the intention to adopt 3G services. Surprisingly, the perceived cost of using 3G services is found to be positively but insignificantly associated with the intention to adopt 3G services. Managerial implications and conclusions are also discussed.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Extending Cloud Computing on Mobile Platform[ ]


Today there is enormous growth in the number of mobile phone users and in mobile applications; together with the emergence of Cloud Computing (CC), this has led to the concept of Mobile Cloud Computing (MCC). MCC is the combination of cloud computing and mobile computing. The emerging cloud computing technology offers a natural solution to extend the limited capabilities of mobile devices. Cloud facilities appear as a promising option that enables mobile device clients to offload their tasks to remote cloud servers. The term "Mobile Cloud Computing" basically refers to an infrastructure where both the data storage and the data processing happen outside of the mobile device. Mobile cloud computing can be applied to the applications installed in mobile devices to increase processing speed and optimize operations to attain efficient results.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Novel Approach for Detection techniques for CD and CC in digital forensics[ ]


Nowadays there is great attention on the accounting and cyber crime fields because of government regulations all over the world [1]. Although these regulations force corporations to provide financial transparency, they still commit accounting frauds such as tax evasion. Moreover, companies have substituted paperwork with IT systems such as DBMS, EDMS, and ERP systems, so there is a need to focus attention on discovering financial information in a database server. However, frauds are difficult to observe and detect because the perpetrators do their best to conceal their fraudulent activities; in particular, there is a need to consider the case of a covert database server. Secondly, a network covert channel is a communication channel that allows two cooperating processes to transfer information over the network in a manner that violates the system's security policy. Since one detection algorithm can only detect one kind of network covert channel, a detection approach based on hierarchical and density-based clustering is proposed. Because the coding scheme of a covert channel causes many similar data values to occur repeatedly, a density-based clustering algorithm can be used to detect several kinds of covert channels. Moreover, the clustering approach based on hierarchy and density is able to tackle detection over a noisy channel, and the algorithm works well to distinguish the covert channel from normal network traffic. This paper proposes a novel approach for the detection of a covert database server and covert channels.
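
A rough sketch of the density-based detection idea follows: covert timing channels that encode bits in inter-packet delays tend to produce a few tightly repeated delay values, so clustering the delays and measuring how tight the clusters are relative to the overall spread can flag suspicious flows. The features, thresholds and synthetic traffic below are illustrative assumptions, not the paper's algorithm.

    # Illustrative density-based clustering check on inter-packet delays.
    import numpy as np
    from sklearn.cluster import DBSCAN   # pip install scikit-learn

    def covert_timing_score(inter_packet_delays, eps=0.002, min_samples=5):
        """Small scores suggest a few tightly repeated delay levels (suspicious)."""
        delays = np.asarray(inter_packet_delays, dtype=float).reshape(-1, 1)
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(delays)
        overall_std = float(delays.std())
        cluster_stds = [float(delays[labels == c].std())
                        for c in set(labels.tolist()) if c != -1]
        if not cluster_stds or overall_std == 0.0:
            return 1.0                      # no dense structure found: nothing suspicious
        return float(np.mean(cluster_stds)) / overall_std

    rng = np.random.default_rng(0)
    normal = rng.exponential(scale=0.05, size=300)                      # irregular delays
    covert = rng.choice([0.01, 0.03], size=300) + rng.normal(0, 0.0002, 300)
    print("normal flow score:", round(covert_timing_score(normal), 3))
    print("covert flow score:", round(covert_timing_score(covert), 3))  # typically much smaller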

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Improving the Throughput of ICTCP in Data Center Network[ ]


Transport Control Protocol (TCP) incast congestion happens when a number of senders transmit in parallel to the same receiver in a high-bandwidth, low-latency network. For many data center applications such as search, heavy traffic is present on such servers. Incast congestion degrades performance because packets are lost due to buffer overflow at the receiver side, so the response time increases. To improve performance, this work focuses on TCP throughput, RTT and the receive window. Our method is to adjust the TCP receive window proactively, before packet loss occurs, and to avoid wasting bandwidth by adjusting the window size according to the number of packets. To avoid packet loss the ICTCP algorithm has been implemented, in a data center network built around top-of-rack (ToR) switches.
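
A much simplified sketch of the receive-window control idea described above is shown below: the receiver compares the measured per-connection throughput with the throughput the current receive window could support (rwnd / RTT) and grows or shrinks rwnd before any loss occurs. The thresholds and the step size of one MSS are illustrative assumptions, not ICTCP's exact parameters.

    # Simplified receiver-side window adjustment in the spirit of ICTCP.
    MSS = 1460  # bytes

    def adjust_rwnd(rwnd, rtt_s, measured_bps, available_bps,
                    increase_thresh=0.1, decrease_thresh=0.5):
        expected_bps = (rwnd * 8) / rtt_s          # throughput the window permits
        ratio = (expected_bps - measured_bps) / expected_bps if expected_bps else 0.0
        if ratio <= increase_thresh and available_bps > expected_bps:
            return rwnd + MSS                      # connection is using its window: grow it
        if ratio >= decrease_thresh:
            return max(2 * MSS, rwnd - MSS)        # window mostly idle: shrink toward 2 MSS
        return rwnd                                # leave the window unchanged

    # One connection on a 1 Gbps link with 100 us RTT, currently limited by rwnd.
    print(adjust_rwnd(rwnd=8 * MSS, rtt_s=100e-6,
                      measured_bps=0.95 * (8 * MSS * 8) / 100e-6,
                      available_bps=1e9))          # window grows by one MSS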

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Providing Security challenges in 4G Systems[ ]


Several research groups are working on designing new security architectures for 4G networks, such as Hokey and Y-Comm. Since designing an efficient security module requires a clear identification of potential threats, this paper attempts to outline the security challenges in 4G networks. A good way to achieve this is by investigating the possibility of extending current security mechanisms to 4G networks. The results show that because 4G is an open, heterogeneous and IP-based environment, it will suffer from new security threats as well as inherent ones. In order to address these threats without affecting 4G dynamics, we propose an integrated security module to protect data, and security models that target security on different entities, hence protecting not only the data but also resources, servers and users.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Review On Architecture Driven Modernization - ADM[ ]


Today there are things which get older, but as they get older their importance also increases because of the operations they are capable of performing. In today's scenario, the need for organizations to keep using their old systems, called legacy systems, is rapidly increasing, and to cope with this situation there must be some way to deal with them. One available solution is the modernization of the existing systems, which ultimately helps in their proper use. The concept which helps us understand a system and make it reusable is Architecture Driven Modernization (ADM): through it we can transform the architecture of an existing system into a new one as per our needs. Architecture is used because it explains almost all the functionality of the system, although the complexity of architectures varies from part to part and element to element. Therefore, in the following paper we try to explain why ADM should be used to modernize a system. To make this possible, one concept that is used most often is the Knowledge Discovery Metamodel (KDM) specification. By using both scenarios, we aim to modernize the existing system without increasing its complexity, while maintaining interoperability along with language and platform independence.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Computer Network Security[ ]


Computer network security is the effort to create a secure computing platform, designed so that users or programs cannot perform actions that they are not allowed to perform, but can perform the actions that they are allowed to. The actions in question can be reduced to operations of access, modification and deletion. Computer network security can be seen as a subfield of security engineering, which looks at broader security issues in addition to network security. However, as more and more people become "wired", an increasing number of people need to understand the basics of security in a networked world. This document was written with the basic computer user and information systems manager in mind, explaining the concepts needed to read through the hype in the marketplace and understand risks and how to deal with them. It is hoped that the reader will gain a wider perspective on computer network security in general, and better understand how to prevent, detect, and take countermeasures against such threats personally, at home, and in the workplace.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Li-Fi (Light Fidelity) - The Future Technology In Wireless Communication[ ]


In simple terms, Li-Fi can be thought of as a light-based Wi-Fi: it uses light instead of radio waves to transmit information. Instead of Wi-Fi modems, Li-Fi would use transceiver-fitted LED lamps that can light a room as well as transmit and receive information. Since simple light bulbs are used, there can technically be any number of access points. Whether you are using wireless internet in a coffee shop, stealing it from the guy next door, or competing for bandwidth at a conference, you have probably gotten frustrated at the slow speeds you face when more than one device is tapped into the network. As more and more people and their many devices access wireless internet, clogged airwaves are going to make it increasingly difficult to latch onto a reliable signal. But radio waves are just one part of the spectrum that can carry our data. What if we could use other waves to surf the internet? One German physicist, Dr. Harald Haas, has come up with a solution he calls "Data Through Illumination": taking the fiber out of fiber optics by sending data through an LED light bulb that varies in intensity faster than the human eye can follow. It is the same idea behind infrared remote controls, but far more powerful. Haas says his invention, which he calls D-Light, can produce data rates faster than 10 megabits per second, which is speedier than your average broadband connection. He envisions a future where data for laptops, smartphones and tablets is transmitted through the light in a room. And security would be a snap: if you cannot see the light, you cannot access the data. Li-Fi is a form of VLC (visible light communication) and is typically implemented using white LED light bulbs. These devices are normally used for illumination by applying a constant current through the LED; however, by fast and subtle variations of the current, the optical output can be made to vary at extremely high speeds. This technology uses a part of the electromagnetic spectrum that is still not greatly utilized: the visible spectrum. Light has in fact been very much part of our lives for millions and millions of years and does not have any major ill effects. Moreover, there is 10,000 times more space available in this spectrum, and just counting the bulbs already in use, it also multiplies to 10,000 times more availability as an infrastructure, globally.
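
The principle described above, switching an LED's intensity faster than the eye can follow, can be illustrated with a toy on-off keying encoder and decoder. Real Li-Fi uses far more elaborate modulation and error coding; this is only a sketch of the idea.

    # Toy visible-light-communication sketch: bytes -> on/off light levels -> bytes.
    def encode_ook(data: bytes):
        """Return a list of 0/1 light levels, most significant bit first."""
        return [(byte >> (7 - bit)) & 1 for byte in data for bit in range(8)]

    def decode_ook(levels, threshold=0.5):
        bits = [1 if level > threshold else 0 for level in levels]
        out = bytearray()
        for i in range(0, len(bits) - len(bits) % 8, 8):
            byte = 0
            for b in bits[i:i + 8]:
                byte = (byte << 1) | b
            out.append(byte)
        return bytes(out)

    levels = encode_ook(b"Li-Fi")
    received = [0.9 * v + 0.05 for v in levels]    # attenuated, slightly noisy samples
    print(decode_ook(received))                    # b'Li-Fi'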

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Geothermal Vapour Absorption Refrigeration System[ ]


NA

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
5 Pen PC Technology[ ]


P-ISM (“Pen-style Personal Networking Gadget Package”) is a new concept currently under development by NEC Corporation. P-ISM is a gadget package including five functions: a pen-style cellular phone with a handwriting data input function, a virtual keyboard, a very small projector, a camera scanner, and a personal ID key with a cashless pass function. P-ISMs are connected with one another through short-range wireless technology. The whole set is also connected to the Internet through the cellular phone function. This personal gadget in a minimalist pen style enables the ultimate in ubiquitous computing.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Case Study In Global Autosafety Systems In Automobiles[ ]


NA

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Comparison of Table-Driven and Source-Initiated Protocols of Mobile Infrastructure-less Networks: A Review[ ]


Highly capable routing is an important issue in the design of wireless network protocols, which must meet severe hardware and resource constraints. Due to the colossal use of wireless networks over the past few decades, infrastructure-less routing has become an important and challenging area of research. Algorithms working in such an environment need to find the routing solution online. Various protocols have been proposed for different types of networks, including ad-hoc and wireless networks. Routing protocols can basically be categorized into two classes: proactive and reactive routing protocols. This paper provides an overall survey of these routing protocols. It starts by giving a detailed description of the problem domain, and then provides a comparison of routing protocols for infrastructure-less routing.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Mathematical Modelling & Wireless Communication Network - A Review[ ]


Of the several kinds of creative activity being promoted in contemporary developments, arguably the most empowering for students is mathematical modelling. A good deal has been written about modelling as a classroom teaching and learning strategy. In the last few decades, there have been abundant discussions among mathematicians and mathematics educators on promoting mathematical modelling (a process of using mathematics to tackle real-world problems) as a classroom practice. Despite the consensus on its importance and relevance, mathematical modelling remains a difficult activity for both teachers and learners to fully engage in. In this paper, we examine some of these difficulties and discuss how technology can play a pivotal role in providing the essential support to make mathematical modelling a more accessible mathematical activity amongst students, through a series of examples drawn from different fields and topics. The paper is also devoted to the study and evaluation of the erroneous-packet flow on the physical layer of a wireless communication network. A mathematical model of the erroneous packets passed to the communication channel has been structured and analysed, and a statistical estimation of the number of erroneous packets is presented and discussed. In wireless sensor networks, sensor nodes are randomly distributed with high density. This paper formulates a mathematical model in which, if some parameters are known, the number of nodes needed to reach a certain coverage fraction can be calculated. Simulation results show that the number of nodes can be calculated with little error compared with the analytical results.
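
The final part of the abstract refers to a model relating the number of randomly deployed nodes to the achieved coverage fraction. One standard formulation of that relationship (the paper's exact model may differ) is the following, for N sensors of sensing radius r placed uniformly at random in a region of area A:

    \[
      p_{\text{cov}} \;=\; 1 - \Bigl(1 - \frac{\pi r^{2}}{A}\Bigr)^{N}
      \;\approx\; 1 - e^{-N\pi r^{2}/A},
      \qquad
      N \;\ge\; \frac{A}{\pi r^{2}}\,\ln\frac{1}{1 - p_{\text{cov}}}.
    \]

For example, with A = 1000 m x 1000 m, r = 50 m and a target coverage fraction of 95%, the approximation gives N >= (10^6 / 7854) * ln 20, i.e. roughly 382 nodes.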

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
An Approach towards Link Positions Routing in Wireless Network-A Review[ ]


Here we account for the fact that MAC protocols incorporate a finite number of transmission attempts per packet. The performance of a path depends not only on the number of links on the path and the quality of its links, but also on the relative positions of the links on the path. Based on this observation, we propose ETOP (Expected number of Transmissions On a Path), a path metric that captures the expected number of link-layer transmissions required for reliable end-to-end packet delivery. We can analytically compute ETOP, which is not trivial, since ETOP is a non-commutative function of the link success probabilities. Although ETOP is a more involved metric, we show that the problem of computing paths with the minimum ETOP cost can be solved by a greedy algorithm. We implement and evaluate a routing approach based on the ETOP metric on a wireless network.
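
The position dependence stated above can be demonstrated with a small Monte Carlo simulation: with a per-link retry limit, a failure deep in the path forces the source to resend the packet end to end, so a weak link near the destination costs more transmissions than the same link near the source. This illustrates the effect only; it is not the paper's closed-form ETOP computation.

    # Monte Carlo illustration of why the expected transmission count is a
    # non-commutative function of the link success probabilities.
    import random

    def simulate_transmissions(link_probs, max_attempts=3, trials=100_000, seed=1):
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            hop = 0
            while hop < len(link_probs):
                p = link_probs[hop]
                for _attempt in range(max_attempts):
                    total += 1
                    if rng.random() < p:
                        break
                else:
                    hop = 0            # retries exhausted: packet re-sent from the source
                    continue
                hop += 1
        return total / trials

    # Same link qualities, different positions on the path.
    print(simulate_transmissions([0.5, 0.9, 0.9]))   # weak link near the source
    print(simulate_transmissions([0.9, 0.9, 0.5]))   # weak link near the destination: costlier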

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Performance of RFID Based Automatic Toll Tax Deduction System [ ]


RFID (radio frequency identification) is an automatic identification technology which identifies individual physical objects according to unique IDs recorded on the RF tags attached to those objects. In this research paper we examine an RFID-based toll deduction system and how to make it more efficient and accurate. The vehicle is equipped with a radio frequency (RF) tag which is detected by an RF reader located at the toll plaza, and the toll amount is then automatically deducted from the bank account. The approach can be considered scalable for implementation in the motor vehicles used today.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Review of Mobile Ad Hoc Network Attacks[ ]


Security is an essential requirement in mobile ad hoc networks (MANETs). Compared to wired networks, MANETs are more vulnerable to security attacks due to the lack of a trusted centralized authority and limited resources. Attacks on ad hoc networks can be classified as passive or active attacks, depending on whether the normal operation of the network is disrupted or not. In this paper, we describe all the prominent attacks.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Production of Ethanol from Waste[ ]


A biofuel is a fuel whose energy comes from carbon fixation; such fuels are produced from living organisms. Biofuel holds significant promise as a potential fuel substitute. Our aim is to produce second-generation bioethanol from the organic fraction of Municipal Solid Waste (MSW). Until now, the feedstock used to produce biofuels has affected the food market, and the recent rise in prices in that market has reduced the profitability of bioethanol projects. The composition of the organic fraction of MSW is similar to lignocellulosic materials: the cellulose and hemicellulose are hydrolyzed and fermented to obtain bioethanol, while the lignin is separated and used to cogenerate the power and heat necessary for the process. It should be noted that using waste in the biodiesel production process can strengthen the case for developing such processes. In this paper we elaborate on the production, properties and chemical processes related to biofuel.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
An Efficient method for the advancement of performance of wireless sensor network[ ]


In this paper, the main emphasis is on upgrading the performance of a wireless sensor network (WSN), or a general wireless network, with a focus on monitoring the delay and network-lifetime issues. Wireless sensor networks also suffer from various problems and vulnerabilities: WSNs suffer from attacks that impact their performance. Our main aim in this paper is to advance the performance of the wireless network, in particular to maximize network lifetime and minimize delay. A wireless sensor network consists of many sensing nodes that capture changes in the environment, enclose the data in data packets and deliver these packets to the sink node present in the network. A WSN consists of a number of sensors spread across a geographical area; each sensor has wireless communication capability and sufficient intelligence for signal processing and networking of the data. The network lifetime is usually defined as the time until the first node fails because of energy depletion, so sleep-wake scheduling is an effective mechanism to increase network lifetime. Sleep-wake scheduling is efficient at increasing network lifetime, but it can result in substantial delays because a transmitting node needs to wait for its next-hop relay node to wake up. Anycast forwarding schemes forward the data packet to the next-hop node in a way that minimizes the expected packet-delivery delay from the sensor nodes to the sink node, and CMAC avoids synchronization while supporting low latency.
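
The anycast idea described above can be sketched as follows: instead of waiting for one fixed next hop, a node forwards the packet to whichever candidate in its forwarding set wakes up first. The wake times below are synthetic and the node names hypothetical.

    # Sketch of anycast next-hop selection under asynchronous sleep-wake scheduling.
    import random

    def pick_anycast_relay(forwarding_set, wake_period=1.0, seed=None):
        """Return (relay, wait) where wait is the time until that relay wakes up."""
        rng = random.Random(seed)
        # Each candidate wakes at an independent random offset within its wake period.
        wake_offsets = {node: rng.uniform(0.0, wake_period) for node in forwarding_set}
        relay = min(wake_offsets, key=wake_offsets.get)
        return relay, wake_offsets[relay]

    relay, wait = pick_anycast_relay(["n4", "n7", "n9"], seed=42)
    print(f"forward to {relay} after waiting {wait:.3f} s")
    # With k candidate relays the expected wait drops to roughly wake_period/(k+1),
    # versus wake_period/2 when a single fixed next hop is used.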

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
HeartID: A Modern way of Authentication[ ]


There is a need to develop a more powerful mechanism: instead of using unreliable and hackable passwords, the software, called HeartID, uses a person’s heartbeat to authenticate their identity and protect access to important information. HeartID is more convenient than existing biometric identification techniques such as fingerprints or irises; a cardiac signal requires a simple touch. Cardiac signals are also highly secure, and their accuracy is rated at greater than 99%. “HeartID is currently the only commercially available biometric authentication solution that uses the cardiac signal.” An individual holds a mouse-like controller with built-in sensors, and their unique cardiac rhythm is recognized by the connected computer. Authorized users with a recognized cardiac rhythm are immediately allowed access and logged in. As a further security measure, HeartID uses continuous monitoring, immediately logging off any user without an authenticated cardiac rhythm.
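
HeartID itself is proprietary, so the sketch below only shows the general shape of template-based cardiac authentication: compare an incoming heartbeat segment against an enrolled template with a normalized correlation and accept above a threshold. The signals and threshold are synthetic assumptions.

    # Illustrative template matching for cardiac-signal authentication.
    import numpy as np

    def normalized_correlation(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.dot(a, b) / len(a))

    def authenticate(segment, template, threshold=0.9):
        return normalized_correlation(segment, template) >= threshold

    t = np.linspace(0, 1, 200)
    template = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8 * t)  # enrolled shape
    genuine = template + np.random.default_rng(0).normal(0, 0.05, t.size)     # same user, noisy
    impostor = np.sin(2 * np.pi * 1.2 * t + 0.9)                              # different morphology
    print(authenticate(genuine, template))    # expected: True
    print(authenticate(impostor, template))   # expected: False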

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Study paper on SPDY protocol: Let’s make the web faster[ ]


This paper gives a short overview of the SPDY protocol. Nowadays the internet environment is getting sophisticated with the help of advanced web applications and improved connection speeds, but the base technology (the protocols in the layers) we are using has not been updated for today’s heavyweight web applications and browsers, which has led to slow browsing. Hence Google’s Chromium group has developed an overlay protocol called SPDY (pronounced “speedy”), which can deliver up to a 50% improvement in browsing speed.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Study paper on SPDY protocol: Let’s make the web faster[ ]


This paper gives a short overview of the SPDY protocol. Nowadays the internet environment is getting sophisticated with the help of advanced web applications and improved connection speeds, but the base technology we are using has not been updated for today’s heavyweight web applications and browsers, which leads to slow browsing. Hence Google’s Chromium group has developed a protocol called SPDY (pronounced “speedy”), which can increase browsing speed by up to 50%. In this paper we discuss the framing layer of the SPDY protocol, deal with its error-handling mechanisms, and compare TLS, HTTP, TCP and SPDY.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Increasing quality of life through Web 3.0[ ]


The Web is entering a new phase of evolution: with the help of Web 3.0 we will use the internet to increase our overall quality of life. Web 3.0 is just that next advancement, and it has also been linked to a possible convergence of service-oriented architecture and the Semantic Web. Developments in augmented reality, personal learning networks, coding languages and wireless devices will shape the possibilities of Web 3.0. The Semantic Web 3.0 and its web services are transforming the Internet from a network of information to a network of knowledge and services. This new phase of evolution has quite a different focus from what Web 2.0 has come to mean. This paper gives an overview of the Semantic Web generations and a detailed description of Web 3.0 services, with their scope and advantages. Web 3.0 will be smarter in how it retrieves and presents the correct information to participants.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Evaluating renewable energy using data mining techniques in developing India[ ]


A developing country like India expects foreign investment in industrialization to achieve economic growth. This paper deals with the study of opportunities and challenges in using renewable energy resources in India and with applying data mining algorithms to renewable energy source data. The daily temperature values of a city or place act as data for the prediction of rainfall, energy calculation, etc. There are weather stations which record the weather of almost every region on earth. The weather data acquired at these stations are raw data and are found in abundance. Data mining techniques can be applied to these raw data to acquire meaningful patterns, which can be used to predict rainfall, solar energy availability and so on. These algorithms cover classification, clustering, statistical learning, association analysis, and link mining, which are all among the most important topics in data mining research and development.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
The Flow of Genetic Algorithm in MANET-Case Study[ ]


In this paper, we show how genetic algorithms can be useful in enhancing the performance of clustering algorithms in mobile ad hoc networks. Encoding the individuals as chromosomes is an essential part of the mapping process. Each individual may represent one or more chromosomes, and each chromosome contains information about the clusterheads and the members thereof, as obtained from the original WCA. The genetic algorithm then uses this information to obtain the best solution (chromosome) defined by the fitness function. The proposed technique is such that each clusterhead handles the maximum possible number of mobile nodes in its cluster in order to facilitate the optimal operation of the medium access control (MAC) protocol. The goal of this paper is to allocate a near-optimal path from source to destination based on time, giving priority to clusterheads to maximize utilization and minimize delay.
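
A compact sketch of the GA layer described above follows: a chromosome assigns each mobile node to one of the candidate clusterheads, and the fitness rewards assignments in which every clusterhead serves as many nodes as possible without exceeding its capacity. The encoding, capacity and fitness below are toy assumptions, not the paper's exact formulation.

    # Toy genetic algorithm over clusterhead assignments.
    import random

    NODES, HEADS, CAPACITY = 20, 4, 6
    rng = random.Random(3)

    def fitness(chrom):
        loads = [chrom.count(h) for h in range(HEADS)]
        # Reward well-filled clusterheads, penalize overloaded ones.
        return (sum(min(load, CAPACITY) for load in loads)
                - 3 * sum(max(0, load - CAPACITY) for load in loads))

    def evolve(pop_size=40, generations=60, mutation=0.05):
        pop = [[rng.randrange(HEADS) for _ in range(NODES)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, NODES)
                child = a[:cut] + b[cut:]                       # one-point crossover
                child = [g if rng.random() > mutation else rng.randrange(HEADS)
                         for g in child]                        # mutation
                children.append(child)
            pop = parents + children
        best = max(pop, key=fitness)
        return best, fitness(best)

    best, score = evolve()
    print("cluster loads:", [best.count(h) for h in range(HEADS)], "fitness:", score)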

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Vision Guided Pick and Place Robotic Arm System Based on SIFT[ ]


This work presents a complete vision-guided robotic arm system for picking and placing objects. For object recognition and localization, the approach exploits Scale Invariant Feature Transform (SIFT) keypoint extraction to segment the correspondences between the object model and the image into different potential object instances with real-time performance. From the obtained correspondences, the best picking point is estimated. The coordinates of this point are then transferred to the robotic arm coordinate system, which allows the arm to pick and place the object at the desired locations. The use of SIFT-based clustering allows the system to be used for applications under extreme conditions of occlusion, where standard appearance-based approaches are likely to be ineffective. The system overcomes most of the challenges encountered in the actual workspace and in real-time applications by offering sufficient speed of operation. It can handle complex ambient illumination conditions, challenging specular backgrounds and conditions of occlusion, and it deals efficiently with diffuse, transparent and specular objects. The system can work with a single view of the object, reducing the time and complexity needed to create the database. The work presented is simple, fast and low-cost for practical implementation. Many measures of efficacy and efficiency are provided on random disposals of objects, with a specific focus on real-time processing. Experimental results on different and challenging kinds of objects are reported using the custom-designed robotic arm to demonstrate the effectiveness of the technique.
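
The recognition front end described above can be sketched with OpenCV's SIFT: match keypoints between a stored object model image and the current scene, keep good matches with Lowe's ratio test, and take the centroid of the matched scene keypoints as a rough candidate picking point in image coordinates. The file names are illustrative, and the transformation into robot-arm coordinates (hand-eye calibration) is not shown.

    # Sketch of SIFT-based object localization for a picking point.
    import cv2
    import numpy as np

    def find_picking_point(model_path, scene_path, ratio=0.75, min_matches=10):
        model = cv2.imread(model_path, cv2.IMREAD_GRAYSCALE)
        scene = cv2.imread(scene_path, cv2.IMREAD_GRAYSCALE)
        sift = cv2.SIFT_create()
        kp_m, des_m = sift.detectAndCompute(model, None)
        kp_s, des_s = sift.detectAndCompute(scene, None)
        if des_m is None or des_s is None:
            return None

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        pairs = matcher.knnMatch(des_m, des_s, k=2)
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]  # Lowe's ratio test
        if len(good) < min_matches:
            return None                                    # object not found in the scene

        pts = np.float32([kp_s[m.trainIdx].pt for m in good])
        return tuple(pts.mean(axis=0))                     # candidate picking point (x, y)

    # point = find_picking_point("object_model.png", "workspace_view.png")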

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Novel Approach for Image Searching using Visual Reranking Technique[ ]


Rapid advances in computing and communication technology are pushing existing information processing tools to their limits. The past few years have seen an overwhelming accumulation of media-rich digital data such as images, video and audio. The internet is an excellent example of a distributed database containing several million images. Image search has become a popular feature in many search engines, including Google, Yahoo!, MSN and others, the majority of which use very little, if any, image information. Image retrieval systems are a powerful tool for managing large-scale image databases, and retrieving images from large and varied collections using image content as a key is a challenging and important problem. Due to the success of text-based search of Web pages and, in part, to the difficulty and expense of using image-based signals, most search engines return images based solely on the text of the pages from which the images are linked; no image analysis takes place to determine relevance or quality, which can yield results of inconsistent quality. Such a search approach has proven unsatisfying, as it often entirely ignores the visual content itself as a ranking signal. To address this issue, we present a new image ranking and retrieval technique known as visual reranking, defined as the reordering of images based on their visual appearance. The approach relies on analyzing the distribution of visual similarities among the images and on an image ranking system that finds the multiple visual themes and their relative strengths in a large set of images. The major advantages of this approach are that it improves search performance by reducing the number of irrelevant images returned and that it provides output of consistent quality. The system performs a text-based search on the database to obtain ranked images, extracts their visual features, and reranks them through visual search.
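
A minimal sketch of the reranking idea, assuming each image already has a visual feature vector (e.g. a colour histogram): the dominant visual theme is estimated as the centroid of the returned set, and the text-ranked list is reordered by similarity to it. The image names, feature vectors and the single-theme centroid rule are illustrative simplifications of the multi-theme analysis described above.

    import numpy as np

    # Text-based ranking (best first) and a visual feature vector per image,
    # e.g. a colour histogram; both are hypothetical placeholders.
    text_ranked = ["img_07", "img_03", "img_12", "img_01", "img_09"]
    features = {
        "img_07": np.array([0.80, 0.10, 0.10]),
        "img_03": np.array([0.70, 0.20, 0.10]),
        "img_12": np.array([0.10, 0.10, 0.80]),   # visually off-topic outlier
        "img_01": np.array([0.75, 0.15, 0.10]),
        "img_09": np.array([0.70, 0.10, 0.20]),
    }

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Estimate the dominant visual theme as the centroid of the returned images,
    # then reorder the list by visual similarity to that theme.
    centroid = np.mean([features[i] for i in text_ranked], axis=0)
    reranked = sorted(text_ranked, key=lambda i: cosine(features[i], centroid), reverse=True)
    print("reranked:", reranked)   # the visual outlier drops towards the end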

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
An Overview on Mobile Ad Hoc Wireless Network & Types of Routing Protocols[ ]


A mobile ad hoc network (MANET) is a self-configuring, infrastructure-less network of mobile devices connected by wireless links. Ad hoc is Latin and means "for this purpose".[1] Equivalently, it is a collection of mobile nodes that are dynamically and arbitrarily located in such a manner that the interconnections between nodes are capable of changing on a continual basis. In order to facilitate communication within the network, a routing protocol is used to discover routes between nodes. The primary goal of such an ad hoc network routing protocol is correct and efficient route establishment between a pair of nodes so that messages may be delivered in a timely manner. MANET nodes are typically distinguished by their limited power, processing and memory resources as well as a high degree of mobility. In this paper we focus our attention on an overview of MANETs and the current routing protocols that provide connectivity in mobile ad hoc networks.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Study paper on 4G and MANET, Wireless Network of Battlefield in Future[ ]


4G is the evolving technology for future wireless networks. The fourth generation of cellular systems will provide a single interface to all kinds of wireless networks, allowing participating nodes to access the network through cellular or wireless LAN connections. Such a capability is particularly attractive for future wireless networks in hostile military environments. By combining two of the hottest wireless networking topics, 4G (the fourth generation of cellular communication systems) and MANET, this paper explores the potential as well as the foreseeable challenges for wireless communications. 4GM@4GW is the idea of implementing fourth-generation wireless technology in mobile ad hoc networks for the next generation of the military.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Effective Intrusion Detection Approach in Mobile Ad Hoc Networks[ ]


Intuitively, intrusions in an information system are the activities that violate the security policy of the system, and intrusion detection is the process used to identify intrusions. It is based on the beliefs that an intruder's behavior will be noticeably different from that of a legitimate user and that many unauthorized actions will be detectable. Intrusion detection systems (IDSs) are usually deployed along with other preventive security mechanisms, such as access control and authentication, as a second line of defense that protects information systems. Intrusion detection in MANETs is one of the major concerns in a peer-to-peer networking scenario where mobile/wireless nodes communicate with each other without any predefined infrastructural setup. This paper presents an overview of various intrusion detection models, identifies their issues, discusses their design, and briefly proposes an intrusion detection system; it also aims to survey and classify current techniques for IDS-aware MANETs. A MANET is infrastructure-less and pervasive in nature, with multi-hop routing and no centralized authority. To support these ideas, discussions of attacks and of research achievements on MANETs are presented.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey On Lunar Protocol[ ]


The current networking design of the Internet architecture has shown some limitations. Restricted by inherent layering constraints, valuable networking information cannot flow freely inside the network stack, and potential operational optimizations are impossible to achieve. To overcome these limitations, we extend the current trend of cross-layer approaches with a framework called underlay protocol fusion: the basic building blocks of Internet functionality are factorized out and merged in a function pool where information sharing and operational optimizations are performed. To illustrate our approach, we present LUNARng (LUNAR next generation). It is a fully distributed underlay protocol designed for the Internet integration of wireless ad hoc networks (MANETs), where fundamental services such as name resolution, address autoconfiguration, and IPv4/IPv6 routing are transparently available whether or not the MANET is connected to the Internet. Internet integration refers here to the ability to insert/remove a MANET into/from the logical organization of the Internet without any loss of functionality. Moreover, by using protocol models, the underlay nature of LUNARng makes it possible to optimally merge (with respect to the multi-hop nature of MANETs) network operations that are traditionally carried out at different layers of the protocol stack.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Technique for Energy Consumption in Wireless sensor Network[ ]


Wireless sensor networks have been widely used in many sectors; their popularity is due to factors such as installation flexibility, mobility, reduced cost and scalability. In wireless sensor networks, energy consumption is one of the major issues, and energy-efficient protocols are designed to overcome it. Many existing energy-efficient routing protocols forward packets to the sink along the minimum-energy path in order to minimize energy consumption, which causes an unbalanced distribution of residual energy among sensor nodes and can leave the network partitioned. This paper focuses on the design of an energy-aware routing protocol using virtual potential fields built from depth, energy density and residual energy. The basic idea is to force packets to move towards the sink through dense-energy areas. The routing problem is also addressed, and the mechanism for detecting and eliminating loops is enhanced. Data security is supported as well, using an algorithm such as ECC (Elliptic Curve Cryptography). The approach is significant for energy balancing, network lifetime and coverage ratio.
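
A small Python sketch of the underlying idea: each packet is steered towards the sink through neighbours that are both closer (smaller depth) and richer in residual energy. The weighting of the two terms is an illustrative assumption rather than the paper's exact potential-field construction, and the ECC-based security step is not shown.

    # Each neighbour is described by its depth (hops to the sink) and residual energy (J).
    # The potential combines both so that packets move towards the sink through
    # energy-rich regions; the weights below are illustrative assumptions.
    W_DEPTH, W_ENERGY = 1.0, 0.5

    def potential(neighbour, my_depth):
        depth_gain = my_depth - neighbour["depth"]          # >0 means closer to the sink
        return W_DEPTH * depth_gain + W_ENERGY * neighbour["residual_energy"]

    def choose_next_hop(my_depth, neighbours):
        # Only consider neighbours that do not move the packet away from the sink.
        candidates = [n for n in neighbours if n["depth"] <= my_depth]
        return max(candidates, key=lambda n: potential(n, my_depth)) if candidates else None

    neighbours = [
        {"id": "n1", "depth": 2, "residual_energy": 0.9},
        {"id": "n2", "depth": 2, "residual_energy": 2.4},
        {"id": "n3", "depth": 3, "residual_energy": 3.0},   # no closer to the sink
    ]
    print(choose_next_hop(my_depth=3, neighbours=neighbours))   # picks n2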

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Diffusion Strategies Outperform Incremental Strategies for Distributed Estimation over Adaptive Networks[ ]


Adaptive networks consist of a collection of nodes with adaptation and learning abilities. The nodes interact with each other on a local level and diffuse information across the network to solve estimation and inference tasks in a distributed manner. In this work, we compare the mean-square performance of two main strategies for distributed estimation over networks: incremental strategies and diffusion strategies. The analysis in the paper confirms that under constant step-sizes, diffusion strategies allow information to diffuse more thoroughly through the network and this property has a favorable effect on the evolution of the network: diffusion networks are shown to converge faster and reach lower mean-square deviation than consensus networks, and their mean-square stability is insensitive to the choice of the combination weights. In contrast, it is shown that consensus networks can become unstable even if all the individual nodes are stable and able to solve the estimation task on their own. When this occurs, cooperation over the network leads to a catastrophic failure of the estimation task. This phenomenon does not occur for diffusion networks: we show that stability of the individual nodes always ensures stability of the diffusion network irrespective of the combination topology.
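
For concreteness, the following numpy sketch runs the adapt-then-combine (ATC) form of diffusion LMS on a small network: each node first performs a local LMS update and then averages its neighbours' intermediate estimates. The topology, step size and uniform combination weights are assumptions chosen only to illustrate the diffusion mechanism.

    import numpy as np

    rng = np.random.default_rng(0)
    M, N, T = 2, 4, 2000            # parameter size, number of nodes, iterations
    w_true = np.array([1.0, -0.5])  # common parameter vector to estimate
    mu = 0.01                       # step size

    # Ring-like topology with uniform (averaging) combination weights.
    neighbors = {0: [0, 1, 3], 1: [0, 1, 2], 2: [1, 2, 3], 3: [2, 3, 0]}
    A = np.zeros((N, N))
    for k, nbrs in neighbors.items():
        for l in nbrs:
            A[l, k] = 1.0 / len(nbrs)          # column k: weights node k applies

    w = np.zeros((N, M))                       # current estimates, one row per node
    for _ in range(T):
        # Adaptation step: each node runs one LMS update on its own streaming data.
        psi = np.zeros_like(w)
        for k in range(N):
            u = rng.normal(size=M)                         # regressor
            d = u @ w_true + 0.1 * rng.normal()            # noisy measurement
            psi[k] = w[k] + mu * (d - u @ w[k]) * u
        # Combination step: each node averages its neighbours' intermediate estimates.
        for k in range(N):
            w[k] = sum(A[l, k] * psi[l] for l in neighbors[k])

    print("estimates:\n", w)   # all rows converge near w_true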

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
General Purpose Six-Stage Pipelined Processor[ ]


This paper proposes the design of a six-stage pipelined processor whose architecture is modified to increase the speed of operation. The architecture includes the ALU, the pipelined datapath, a data forwarding unit, control logic, data and program memories, and a hazard control unit. The hazard detection and data forwarding units are included for efficient implementation of the pipeline. Design and verification of the processor have been carried out in Verilog on the Xilinx 14.1 platform, and an assembler written in Perl, which reduces the complexity of writing instructions into program memory, is used in this design. As a result, the design uses 1168 LUTs, achieves a frequency of 277.9 MHz, and delivers 20% better performance on single-thread programs than a conventional pipelined processor.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
An Improved Optimum Multilevel Dynamic Round Robin Scheduling Algorithm[ ]


The main objective of this paper is to improve the Round Robin scheduling algorithm using the concept of a dynamic time slice. Scheduling, the sharing of computer resources between multiple processes, is one of the fundamental functions of an operating system. Round Robin scheduling is mainly used in real-time systems, but it still has several difficulties. The intention is to allow as many running processes as possible at all times in order to make the best use of the CPU, and CPU scheduling strongly affects resource utilization as well as the overall performance of the system. The Round Robin algorithm performs well in time-shared systems, but it is not suitable for soft real-time systems because it produces a larger number of context switches, longer waiting times and longer response times. In this paper, a new CPU scheduling algorithm, called the Optimum Multilevel Dynamic Round Robin Scheduling Algorithm, is proposed; it calculates an intelligent time slice that changes after every round of execution. The proposed algorithm was evaluated against standard CPU scheduling objectives and was observed to perform well compared with other existing CPU scheduling algorithms.
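
A minimal Python simulation of dynamic-time-slice round robin is sketched below; the time slice is recomputed at the start of every round, here as the median of the remaining burst times, which is an illustrative stand-in for the paper's intelligent time-slice formula.

    from statistics import median

    def dynamic_round_robin(bursts):
        """Simulate round robin where the time slice is recomputed each round.
        bursts: dict of process name -> CPU burst time. Returns completion order."""
        remaining = dict(bursts)
        order = []
        while remaining:
            # "Intelligent" time slice for this round: median of the remaining bursts
            # (an illustrative choice; the paper derives its own formula).
            slice_ = median(remaining.values())
            for p in list(remaining):
                remaining[p] -= min(slice_, remaining[p])
                if remaining[p] <= 0:
                    del remaining[p]
                    order.append(p)
        return order

    print(dynamic_round_robin({"P1": 14, "P2": 4, "P3": 9, "P4": 21}))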

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Advanced Switch Path First Algorithm for Advance Lightpath Reservation in Optical Grids[ ]


Grid computing is a recently developed technology which involves the simultaneous use of different resources connected by network links. This characteristic of grid computing mandates that all the resources involved in the execution of a grid application be available at predetermined times. Advance reservation is a technique which ensures that a request gets 100% availability of resources. In optical grid networks, an important technology currently in use is the advance reservation of lightpaths; it is key to guaranteeing that enough wavelengths will be available when needed. Generally, a lightpath is defined as a wavelength data channel which links multiple optical segments. In this work, we propose algorithms to solve the routing and wavelength assignment (RWA) problem for advance lightpath reservation requests. The Advanced Switch Path First (ASPF) algorithm uses a parallel Dijkstra algorithm for routing between source and destination and an SPF scheduling strategy for selecting a lightpath among all available lightpaths. This algorithm helps achieve a good speedup ratio and minimizes the blocking probability of requests.
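
The sketch below illustrates the two halves of such an RWA computation in Python: a (sequential) Dijkstra shortest-path search followed by a wavelength choice that respects wavelength continuity along the path. The first-fit rule used here is a simple placeholder for the paper's SPF scheduling strategy, and the parallel Dijkstra variant is not shown.

    import heapq

    def dijkstra(graph, src, dst):
        """graph: {node: {neighbour: cost}}. Returns the shortest path as a node list."""
        dist, prev, pq = {src: 0}, {}, [(0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == dst:
                break
            if d > dist.get(u, float("inf")):
                continue
            for v, c in graph[u].items():
                if d + c < dist.get(v, float("inf")):
                    dist[v], prev[v] = d + c, u
                    heapq.heappush(pq, (d + c, v))
        path, node = [dst], dst
        while node != src:
            node = prev[node]
            path.append(node)
        return path[::-1]

    def first_fit_wavelength(path, free):
        """free: {(u, v): set of wavelengths free on that link}. Pick the lowest-indexed
        wavelength available on every link of the path (wavelength-continuity constraint)."""
        links = list(zip(path, path[1:]))
        common = set.intersection(*(free[l] for l in links))
        return min(common) if common else None

    graph = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "C": 1, "D": 5},
             "C": {"A": 4, "B": 1, "D": 1}, "D": {"B": 5, "C": 1}}
    free = {("A", "B"): {0, 1, 2}, ("B", "C"): {1, 2}, ("C", "D"): {1, 3}}
    path = dijkstra(graph, "A", "D")          # A -> B -> C -> D
    print(path, "wavelength:", first_fit_wavelength(path, free))   # wavelength 1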

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Novel Approach towards Defeating Global Eavesdropper in Wireless Sensor Network for Location Privacy using Global Inspector[ ]


Sensor networks are used in a variety of application areas to monitor objects. Privacy is one of the major issues in wireless sensor networks, since wireless transmissions are susceptible to illegal interception and detection. Many protocols provide content-oriented security in wireless sensor networks, which protects the actual content of the messages, but context-oriented information related to that content, e.g. location information, generally remains insecure. Such context-oriented information can be used by an adversary to infer sensitive information such as the locations of monitored objects and of sinks in the network field. A number of techniques exist that are capable of defeating the limited adversary called a local eavesdropper, who can only observe network traffic in a small region, but very few techniques have been proposed to achieve protection against the stronger adversary called a global eavesdropper. Existing approaches provide different techniques for preserving source location privacy and sink location privacy. The proposed technique uses a backbone formation algorithm and a Global Inspector, and each packet is passed from source to destination through the Global Inspector. This approach provides location privacy for the sources as well as the sinks in the sensor network, and it also offers a trade-off between privacy and communication cost.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Introduction to data warehousing and data mining[ ]


Data warehousing supports business analysis and decision making by creating an enterprise-wide integrated database of summarized, historical information. Data mining, the extraction of hidden predictive information from large databases, is a powerful new technology with great potential to help companies focus on the most important information in their data warehouses. Data mining tools predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by the retrospective tools typical of decision support systems. This paper describes the basic architecture of data warehousing, its software and the data warehousing process, and it also presents different techniques used in data mining.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Framework of Data Integrity Verification for Multi Clouds Using CPDP Scheme[ ]


In this paper, we address the construction of a PDP (provable data possession) scheme for cloud storage to support a growing workload of service and data migration, in which we consider the availability of multiple cloud service providers to cooperatively store and manage the clients' information. A cloud computing environment is built on completely open architectures and interfaces. It has the power to combine multiple clouds, whose services may be provided by an IT department to its own organization or by public or private cloud services operated by third parties outside the organization, so as to give organizations a high ability to work together. Security, in terms of integrity, is therefore one of the most important aspects of a cloud computing environment. Our experiments show that our solution introduces lower computation and communication overheads in comparison with non-cooperative approaches. We prove the security of our scheme based on zero-knowledge proofs.
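
The following Python sketch conveys only the general challenge-response flavour of provable data possession, using salted hashes and random block sampling; it is a deliberately simplified illustration and not the cooperative PDP construction with homomorphic tags and zero-knowledge proofs proposed in the paper.

    import hashlib, os, random

    def tag_blocks(data, block_size=256):
        """Owner side: split the file into blocks and keep a salted hash of each block."""
        blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
        salts = [os.urandom(16) for _ in blocks]
        tags = [hashlib.sha256(s + b).digest() for s, b in zip(salts, blocks)]
        return blocks, salts, tags

    def prove(stored_blocks, challenged_indices, salts):
        """Provider side: answer a challenge by hashing the requested blocks."""
        return [hashlib.sha256(salts[i] + stored_blocks[i]).digest() for i in challenged_indices]

    def verify(proof, challenged_indices, tags):
        """Verifier side: compare the returned hashes against the stored tags."""
        return all(p == tags[i] for p, i in zip(proof, challenged_indices))

    data = os.urandom(4096)                       # the outsourced file
    blocks, salts, tags = tag_blocks(data)        # metadata kept by the owner/verifier
    challenge = random.sample(range(len(blocks)), k=4)
    print("intact:", verify(prove(blocks, challenge, salts), challenge, tags))

    blocks[2] = b"tampered" + blocks[2][8:]       # simulate corruption at the provider
    challenge = [0, 2, 5]
    print("after tampering:", verify(prove(blocks, challenge, salts), challenge, tags))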

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Biometrics[ ]


Biometric recognition refers to the automatic recognition of individuals based on feature vectors derived from their physiological and/or behavioral characteristics. Biometric recognition systems should provide reliable personal recognition schemes to either confirm or determine the identity of an individual. Applications of such systems include computer systems security, secure electronic banking, mobile phones, credit cards, secure access to buildings, and health and social services. By using biometrics a person can be identified based on "who she/he is" rather than "what she/he has" (card, token, key) or "what she/he knows" (password, PIN). In this paper, a brief overview of biometric methods, both unimodal and multimodal, and of their advantages and disadvantages is presented. The term "biometrics" is derived from the Greek words bio (life) and metric (to measure). Biometrics refers to the automatic identification of a person based on his/her physiological or behavioral characteristics. This method of identification is preferred over traditional methods involving passwords and PINs for its accuracy. A biometric system is essentially a pattern recognition system which makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user. An important issue in designing a practical system is to determine how an individual is identified. Depending on the context, a biometric system can be either a verification (authentication) system or an identification system: verification involves confirming or denying a person's claimed identity, while in identification one has to establish a person's identity. Biometric systems are divided on the basis of the authentication medium used. They are broadly divided into identification by hand geometry, vein pattern, voice pattern, DNA, signature dynamics, fingerprints, iris pattern and face recognition. These methods are chosen on the basis of the scope of the testing medium, the accuracy required and the speed required. Every medium of authentication has its own advantages and shortcomings. With the increased use of computers as vehicles of information technology, it is necessary to restrict unauthorized access to, or fraudulent use of, sensitive/personal data. Biometric techniques, being potentially able to support this restriction, are enjoying renewed interest.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Cloud Storage Security And Privacy Preservation[ ]


Cloud computing relies on the sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network. Using cloud storage, users store their data in the cloud without the burden of local data storage and maintenance, and enjoy high-quality applications and services from a shared pool of configurable computing resources. The problem with this scenario is that the user no longer has the data in their possession; the data could be changed at any point, which is a formidable risk. Users should be able to use cloud storage as if it were local, without worrying about the need to verify its integrity. This is where the Third Party Auditor (TPA) comes into the picture: the TPA checks the integrity of the outsourced data, and the user becomes worry-free. By utilizing a public-key-based homomorphic authenticator with random masking, privacy-preserving public auditing can be achieved. We further extend our result to enable the TPA to perform audits for multiple users simultaneously and efficiently. Support for data dynamics can also be achieved.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Comparison of Table-Driven and Source-Initiated Protocols of Mobile Infrastructure-less Networks: A Review[ ]


Highly capable routing is an important issue in the design of wireless network protocols, which must meet severe hardware and resource constraints. Owing to the colossal use of wireless networks over the past few decades, infrastructure-less routing has become an important and challenging area of research. Algorithms working in such environments need to find routing solutions online. Various protocols have been proposed for different types of networks, including wireless ad hoc networks. Routing protocols can basically be categorized into two provinces: proactive and reactive. This paper provides an overall survey of these routing protocols; it starts with a detailed description of the problem domain and then compares routing protocols for infrastructure-less routing.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Email Based LAN Monitoring System[ ]


In an organization, computers are grouped together to form a network. Managing and controlling the activities of the network while in the office is an easy task; but how do you monitor and control the network while you are out of station, away from the office? Instead of depending on third-party information, you can have your email account serve the purpose. The interaction between the clients and the remote administrator is achieved via a central monitoring server. The main objective of this paper is to provide the administrator with maximum detail about the network through their email account whenever the administrator is away from the office or out of station. In the era of internet services, email is widely used and has penetrated every part of our life, yet remote monitoring of networks through email is still a mirage; this paper is an effort to make that mirage a reality, and this is where its genesis lies.
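
A minimal sketch of the reporting step, assuming the central monitoring server mails its status summary to the administrator over SMTP; the host name, port, credentials and addresses below are placeholders.

    import smtplib
    from email.message import EmailMessage

    def send_status_report(report_text):
        """Mail a LAN status report to the administrator's account.
        Host, port, credentials and addresses below are placeholders."""
        msg = EmailMessage()
        msg["Subject"] = "LAN status report"
        msg["From"] = "monitor@example.org"
        msg["To"] = "admin@example.org"
        msg.set_content(report_text)

        with smtplib.SMTP("smtp.example.org", 587) as server:
            server.starttls()                       # encrypt the session
            server.login("monitor@example.org", "app-password")
            server.send_message(msg)

    # Example: a very small report gathered by the central monitoring server.
    report = "Clients online: 42\nClients offline: 3\nLast scan: 2013-02-15 10:30"
    send_status_report(report)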

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Cloud Computing: Service Level Agreements (SLA)[ ]


Cloud computing is a very complex system. It involves the integration of many information and communication technologies, such as the Internet, distributed computing, grids and client-server architectures. The main issue with any cloud is the user's trust. To achieve this trust, scientists have put forth many solutions, one of which is service-usage-agreement-based cloud development. In this system the cloud service provider and the user make an agreement regarding the services of the cloud and their usage, generally termed a Service Level Agreement (SLA). An SLA usually documents the legal and technical issues related to the usage of the cloud by users. As developing and maintaining a cloud involves a good amount of finance, users often find themselves in no position to argue with the cloud provider or to force it to bend its working to the user's requirements; in such cases the user generally accepts the most suitable SLA package offered by the cloud service provider. So it is necessary for the user to know: what is an SLA? What should it cover? And why is it difficult to impose the user's perspective on cloud service providers in an SLA-based cloud? This paper explores some desired features of SLAs together with the challenges in implementing an SLA-based cloud, in order to answer the above questions to some extent.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
HMM: An Efficient Credit Card Fraud Detection[ ]


Due to rapid advancement in electronic commerce technology, the use of credit cards has increased dramatically. As the credit card becomes the most popular mode of payment for both online and regular purchases, cases of fraud associated with it are also rising. In this paper, we model the sequence of operations in credit card transaction processing using a Hidden Markov Model (HMM) and show how it can be used for the detection of frauds. An HMM is initially trained with the normal behavior of a cardholder. If an incoming credit card transaction is not accepted by the trained HMM with sufficiently high probability, it is considered to be fraudulent. At the same time, the cardholder is alerted about the suspected fraud via SMS and e-mail.
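
A minimal numpy sketch of the detection step is shown below: transaction amounts are assumed to be quantized into low/medium/high observation symbols, the forward algorithm gives the likelihood of an observation window, and a sharp drop in likelihood when the new transaction enters the window flags possible fraud. The HMM parameters and the threshold are illustrative assumptions; in practice the model would be trained on the cardholder's history (e.g. with Baum-Welch), which is omitted here.

    import numpy as np

    # Observation symbols: 0 = low, 1 = medium, 2 = high spending per transaction.
    # Illustrative HMM parameters (in practice learnt from the cardholder's past data):
    pi = np.array([0.6, 0.3, 0.1])                  # initial hidden-state distribution
    A = np.array([[0.7, 0.2, 0.1],                  # hidden-state transition matrix
                  [0.3, 0.5, 0.2],
                  [0.2, 0.3, 0.5]])
    B = np.array([[0.7, 0.2, 0.1],                  # emission probabilities per state
                  [0.3, 0.5, 0.2],
                  [0.1, 0.3, 0.6]])

    def sequence_likelihood(obs):
        """Forward algorithm: probability of the observation sequence under the HMM."""
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    def is_fraudulent(recent_obs, new_obs, threshold=0.5):
        """Sliding-window check: flag the new transaction if it makes the window's
        likelihood drop by more than the threshold fraction."""
        before = sequence_likelihood(recent_obs)
        after = sequence_likelihood(recent_obs[1:] + [new_obs])
        return (before - after) / before > threshold

    history = [0, 0, 1, 0, 1, 0]       # cardholder normally makes low/medium purchases
    print(is_fraudulent(history, 0))   # another low purchase -> False (normal)
    print(is_fraudulent(history, 2))   # sudden high-value purchase -> True (flagged)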

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
A Survey paper on Cloud Computing and its effective utilization with Virtualization.[ ]


Cloud computing delivers IT capabilities as services on demand. As the number of cloud vendors rises, resource counts and types are ever increasing, leading to a need for cloud management solutions which facilitate easy cloud adoption. While providing several services, cloud management's primary role is resource provisioning: in order to meet application needs in terms of resources, cloud developers must carefully choose among the existing offers when deploying their applications. This scalable and elastic model provides advantages such as faster time-to-market, no capital expenditure and a pay-per-use business model. While there are several such benefits, there are challenges in adopting public clouds because of the dependency on infrastructure that is not completely controlled internally but rather shared with outsiders. Several enterprises, especially large ones that have already invested in their own infrastructure over the years, are looking at setting up private clouds within their organizational boundaries to reap the benefits of cloud computing technologies while leveraging those investments. Dynamic provisioning is a useful technique for handling virtualized multi-tier applications in a cloud environment, and understanding the performance of virtualized multi-tier applications is crucial for efficient cloud infrastructure management.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Comparison of Energy Aware Load Balancing Algorithms In Cloud Computing.[ ]


Cloud computing has been widely adopted by industry and organizations, though there are many existing issues, such as load balancing, virtual machine consolidation and energy management, which have not been fully addressed. Central to these is the issue of load balancing, which is required to distribute the excess dynamic local workload evenly across all the nodes in the cloud in order to achieve high user satisfaction. We propose the DT-PALB (Double Threshold Power Aware Load Balancing) method to deploy virtual machines for power-saving purposes; DT-PALB is an extension of the original PALB (Power Aware Load Balancing) algorithm. Simulations and experiments have been conducted to verify our algorithms, using average power consumption as the performance metric and the result of PALB as the baseline. The results show that the proposed DT-PALB reduces the number of powered-on physical machines and the average power consumption compared with other power-saving deployment algorithms.
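
A small Python sketch of the double-threshold placement idea follows: a virtual machine goes to an already powered-on host only if the resulting utilization stays within the thresholds, and a new host is powered on otherwise. The threshold values and the packing preference are illustrative assumptions, not the exact DT-PALB rules.

    LOWER, UPPER = 0.25, 0.75          # illustrative utilization thresholds

    class Host:
        def __init__(self, name, capacity=100):
            self.name, self.capacity, self.used, self.on = name, capacity, 0, False

        def utilization(self, extra=0):
            return (self.used + extra) / self.capacity

    def place_vm(hosts, vm_load):
        """Place a VM of size vm_load, preferring powered-on hosts that stay under UPPER."""
        # Prefer the most-utilized powered-on host that still stays below the upper
        # threshold (packs work onto fewer machines so lightly-used ones can be idled).
        candidates = [h for h in hosts if h.on and h.utilization(vm_load) <= UPPER]
        if candidates:
            target = max(candidates, key=lambda h: h.utilization())
        else:
            target = next(h for h in hosts if not h.on)    # power on a new host
            target.on = True
        target.used += vm_load
        return target

    hosts = [Host(f"host{i}") for i in range(3)]
    for load in [30, 40, 20, 50, 10]:
        h = place_vm(hosts, load)
        print(f"VM({load}) -> {h.name} (util {h.utilization():.2f})")
    # Hosts that end up below the LOWER threshold are candidates for consolidation
    # and power-off in a fuller implementation.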

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Enhancing the Security of a Cluster-based Wireless Sensor Network Using Hybrid Intrusion Detection System[ ]


Advances in wireless communication and miniature electronics have enabled the development of small, low-cost, low-power sensor nodes (SNs) with sensing, computation and communication capabilities. Consequently, the issues surrounding Wireless Sensor Networks (WSNs) have become popular research subjects. A WSN is an infrastructure-based network formed through the mass deployment of SNs. The major function of a WSN is to collect and monitor information about a specific environment: the SNs sense the surrounding environment or a given target and deliver the data to the sink using wireless communication, and the data is then analyzed to determine the state of the target.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Networking and VANET Control For Data Aggregation [ ]


In-network data aggregation is a useful technique to reduce redundant data and to improve communication efficiency; however, traditional schemes cannot be applied in highly mobile vehicular environments. We aim to implement an adaptive forwarding delay control scheme, namely Catch-Up, which dynamically changes the forwarding speed of nearby reports so that they have a better chance of meeting each other and being aggregated together. The Catch-Up scheme is designed around a distributed learning algorithm in which each vehicle learns from local observations and chooses a delay based on the learning results. Traditional data aggregation schemes for wireless sensor networks usually rely on a fixed routing structure to ensure that data can be aggregated at certain sensor nodes.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Performance of an RFID Based Automatic Toll Deduction System[ ]


RFID (radio frequency identification) is an automatic identification technology which identifies individual physical objects according to the unique IDs recorded on the RF tags attached to those objects. In this research paper we examine an RFID-based toll deduction system and how to make it more efficient and accurate. The vehicle is equipped with a radio frequency (RF) tag which is detected by an RF reader located at the toll plaza, and the toll amount is then automatically deducted from the bank account. The approach described in this paper can be considered scalable for implementation in the motor vehicles used today.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
SAR images using IP and computer vision[ ]


The cost of processing was fairly high, however, with the computing equipment of that era. That changed in the 1970s, when digital image processing proliferated as cheaper computers and dedicated hardware became available. Images could then be processed in real time for some dedicated problems such as television standards conversion. Many of the techniques of digital image processing, or digital picture processing as it was often called, were developed at a few research facilities, with applications to satellite imagery, wire-photo standards conversion, medical imaging, videophone, character recognition and photograph enhancement.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Software engineering[ ]


Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation and maintenance of software, and the study of these approaches; that is, the application of engineering to software.[1][2][3] In layman's terms, it is the act of using insights to conceive, model and scale a solution to a problem. The first reference to the term is the 1968 NATO Software Engineering Conference, where it was meant to provoke thought regarding the perceived "software crisis" at the time.[4][5][6] Software development, a much used and more generic term, does not necessarily subsume the engineering paradigm. The generally accepted concepts of software engineering as an engineering discipline have been specified in the Guide to the Software Engineering Body of Knowledge (SWEBOK). The SWEBOK has become the internationally accepted standard ISO/IEC TR 19759:2005.[7]

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Implementation of Multilevel Thresholding[ ]


Digital image processing has been applied in several areas, especially where it is necessary to use tools for feature extraction and to obtain patterns from the studied images. In an initial stage, segmentation is used to separate the image into parts that represent an object of interest, which may then be used in a specific study. Several methods intend to perform this task, but it is difficult to find one that can easily adapt to different types of images, which are often very complex or specific. To address this problem, this work presents an adaptable segmentation method that can be applied to different types of images, providing better segmentation. The proposed method is based on a model of automatic multilevel thresholding and considers techniques of grouped histogram quantization, analysis of the histogram slope percentage, and calculation of maximum entropy to define the thresholds.
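
As a small illustration of the maximum-entropy criterion mentioned above, the numpy sketch below selects a single threshold that maximizes the combined entropy of the background and foreground histogram classes (Kapur's method); the paper's method is multilevel and also uses histogram quantization and slope analysis, which are not reproduced here.

    import numpy as np

    def max_entropy_threshold(image):
        """Kapur's criterion: pick the gray level that maximizes the sum of the
        entropies of the background and foreground histogram classes."""
        hist = np.bincount(image.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()
        best_t, best_h = 0, -np.inf
        for t in range(1, 256):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            p0, p1 = p[:t] / w0, p[t:] / w1
            h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
            h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
            if h0 + h1 > best_h:
                best_t, best_h = t, h0 + h1
        return best_t

    # Synthetic test image: a dark background with a brighter object region.
    rng = np.random.default_rng(1)
    img = rng.normal(60, 10, (64, 64))
    img[20:40, 20:40] = rng.normal(180, 10, (20, 20))
    img = np.clip(img, 0, 255).astype(np.uint8)

    t = max_entropy_threshold(img)
    print("threshold:", t)                     # falls between the two intensity modes
    segmented = img >= t                       # binary segmentation mask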

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Detecting and preventing vampire attack in wireless sensor network[ ]


Vampire attacks are not specific to any particular protocol, but rather rely on the properties of many popular classes of routing protocols. A single Vampire can increase network-wide energy usage by a factor of O(N), where N is the number of network nodes. This paper considers two attacks on stateless protocols: first, the carousel attack, in which an adversary sends a packet with a route composed as a series of loops, so that the same node appears in the route many times; and second, the stretch attack, in which a malicious node constructs artificially long source routes, causing packets to traverse a larger than optimal number of nodes. Vampire attacks are very difficult to detect and even more difficult to prevent.
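
Two simple route sanity checks, sketched in Python, convey how a forwarding node might recognize these attacks: a repeated node in the source route indicates a carousel loop, and a route far longer than the known shortest path suggests stretching. The slack factor used for the stretch check is an illustrative heuristic, not a rule from the paper.

    def has_carousel(route):
        """A carousel attack puts the same node in the source route more than once."""
        return len(route) != len(set(route))

    def is_stretched(route, shortest_hops, slack=1.5):
        """Flag routes that are much longer than the known shortest path to the
        destination (the 1.5x slack factor is an illustrative heuristic)."""
        return len(route) - 1 > slack * shortest_hops

    honest_route = ["S", "A", "B", "D"]
    loopy_route = ["S", "A", "B", "A", "B", "D"]           # repeated nodes: carousel
    long_route = ["S", "C", "E", "F", "G", "H", "B", "D"]  # detour far beyond 3 hops

    print(has_carousel(honest_route), has_carousel(loopy_route))   # False True
    print(is_stretched(long_route, shortest_hops=3))               # True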

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Optimal Number of Shares For Digital Watermarking Scheme Using Visual Cryptography[ ]


Distribution channels such as digital music downloads, video-on-demand and multimedia social networks pose new challenges to the design of content protection measures aimed at preventing copyright violations. Digital watermarking has been proposed as a possible building block of such protection systems; however, the application of watermarking for multimedia content protection in realistic scenarios raises several security issues. Secure signal processing, by which we mean a set of techniques able to process sensitive signals that have been obfuscated either by encryption or by other privacy-preserving primitives, may offer valuable solutions to these issues. More specifically, the adoption of efficient methods for watermark embedding or detection on data that have been secured in some way, which we call secure watermarking for short, provides an elegant way to address the security concerns of such applications. A digital watermarking technique is used to generate meaningful shares: the secret image shares are watermarked with different cover images and transmitted. At the receiving side, the cover images are extracted from the shares and stacked one by one, which reveals the secret image progressively. Digital watermarking using visual cryptography provides improved security for encrypting secret images.
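
For background, the sketch below implements the basic 2-out-of-2 visual cryptography scheme in numpy: each secret pixel expands into a 2x2 block, white pixels receive identical random patterns in both shares, black pixels receive complementary ones, and stacking (pixel-wise OR) reveals the secret. These shares are random-looking, unlike the meaningful watermarked shares described above, which are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)

    # Candidate 2x2 subpixel patterns, each half black (1 = black, 0 = white).
    PATTERNS = [np.array([[1, 0], [0, 1]]), np.array([[0, 1], [1, 0]]),
                np.array([[1, 1], [0, 0]]), np.array([[0, 0], [1, 1]])]

    def make_shares(secret):
        """secret: binary array (1 = black pixel). Returns two noise-like shares,
        each twice the size of the secret in both dimensions."""
        h, w = secret.shape
        s1 = np.zeros((2 * h, 2 * w), dtype=int)
        s2 = np.zeros_like(s1)
        for i in range(h):
            for j in range(w):
                pat = PATTERNS[rng.integers(len(PATTERNS))]
                s1[2*i:2*i+2, 2*j:2*j+2] = pat
                # White pixel: identical patterns (stack to half black).
                # Black pixel: complementary patterns (stack to all black).
                s2[2*i:2*i+2, 2*j:2*j+2] = pat if secret[i, j] == 0 else 1 - pat
        return s1, s2

    secret = np.array([[0, 1, 0],
                       [1, 1, 0]])
    share1, share2 = make_shares(secret)
    stacked = share1 | share2          # overlaying transparencies = pixel-wise OR
    print(stacked)                     # 2x2 blocks are all black where the secret was black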

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Study On Low Maintenance Wireless Network Architecture In Implementing Wireless Networks In Dense Forests and Rural area[ ]


This paper presents an innovative, cost-effective and eco-friendly approach to implementing wide-area wireless network coverage for surveillance, remote wireless sensing, monitoring and other uses in the dense forests and rural areas of developing countries such as India. The paper gives a simplified layout for a network implementation that bridges the digital divide and the knowledge flow between urban areas and remote locations where telecom companies cannot easily reach or deploy networks, since doing so is a costly and difficult affair. The proposed work shows a way to overcome the challenges of implementing cellular and wireless networks and of making the technology accessible to rural areas so as to enhance their growth and economy.

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Estimating Parameters of a Three-Phase Induction Motor using MATLAB/Simulink[ ]


Steady-state analysis of a three-phase induction motor can be carried out using computer simulation software such as MATLAB. Using the induction motor model from the Simulink/Power System Blockset (PSB) library, the DC resistance, no-load and blocked-rotor tests can be effectively simulated. These tests are performed on a machine to estimate the values of the stator and rotor resistances and the leakage reactances. Although the skin effect and temperature are not taken into consideration in the DC resistance model, the results derived are approximate and reasonably good.
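
A small Python sketch of the standard per-phase calculations from these tests is given below, using hypothetical test readings for a wye-connected machine and assuming an equal split of the blocked-rotor leakage reactance between stator and rotor; it mirrors the textbook procedure rather than the Simulink/PSB simulation itself.

    import math

    # Hypothetical test readings for a wye-connected three-phase induction motor.
    V_dc, I_dc = 12.0, 10.0                  # DC test: line-to-line voltage, line current
    V_nl, I_nl, P_nl = 400.0, 6.0, 350.0     # no-load test: line voltage, line current, 3-phase power
    V_br, I_br, P_br = 80.0, 12.0, 900.0     # blocked-rotor test: same quantities

    # DC test: for a wye connection the measured resistance covers two phases in series.
    R1 = V_dc / (2 * I_dc)

    # Blocked-rotor test (per phase): magnetizing branch neglected.
    V_br_ph = V_br / math.sqrt(3)
    R_br = P_br / (3 * I_br**2)
    Z_br = V_br_ph / I_br
    X_br = math.sqrt(Z_br**2 - R_br**2)
    R2 = R_br - R1                     # rotor resistance referred to the stator
    X1 = X2 = X_br / 2                 # assumed equal split of leakage reactance

    # No-load test (per phase): rotor branch neglected, so X_nl ~ X1 + Xm.
    V_nl_ph = V_nl / math.sqrt(3)
    Z_nl = V_nl_ph / I_nl
    R_nl = P_nl / (3 * I_nl**2)
    X_nl = math.sqrt(Z_nl**2 - R_nl**2)
    Xm = X_nl - X1

    print(f"R1={R1:.3f}  R2'={R2:.3f}  X1=X2'={X1:.3f}  Xm={Xm:.3f}  (ohms/phase)")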

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------