Digital Library
Vol. 6, No. 4, Dec. 2010
Zhi-Hui Wang, Chin-Chen Chang, Pei-Yu Tsai
Vol. 6, No. 4, pp. 435-452, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.435
Keywords: Data Hiding, Steganography, Vector Quantization
This paper proposes a novel reversible data hiding scheme based on a Vector Quantization (VQ) codebook. The proposed scheme uses the principal component analysis (PCA) algorithm to sort the codebook and to find two similar codewords for an image block. According to the secret bits to be embedded and the difference between those two similar codewords, the original image block is transformed into a difference-number table. Finally, this table is compressed by entropy coding and sent to the receiver. The experimental results demonstrate that the proposed scheme achieves a high hiding capacity, about five bits per index, with an acceptable bit rate. At the receiver end, after the compressed code has been decoded, the image can be recovered to a VQ compressed image.
Elena Andreeva, Bart Mennink, Bart Preneel
Vol. 6, No. 4, pp. 453-480, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.453
Keywords: Hash Functions, Domain Extenders, Security Properties
Cryptographic hash functions reduce inputs of arbitrary or very large length to a short string of fixed length. All hash function designs start from a compression function with fixed-length inputs. The compression function itself is designed from scratch, or derived from a block cipher or a permutation. The most common procedure to extend the domain of a compression function in order to obtain a hash function is a simple linear iteration; however, some variants use multiple iterations or a tree structure that allows for parallelism. This paper presents a survey of 17 domain extenders in the literature. It considers the natural question of whether these extenders preserve the security properties of the compression function, in particular collision resistance, second-preimage resistance, preimage resistance, and the pseudo-random oracle property.
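The simplest domain extender the survey mentions, plain linear iteration (the Merkle-Damgard construction), can be sketched in a few lines. The block size, initial value, and toy compression function below are illustrative stand-ins, not any concrete design from the survey, and the toy function is of course not cryptographically secure:

```python
# Sketch of the plain Merkle-Damgard domain extender: a fixed-input
# compression function is iterated linearly over padded message blocks
# to hash inputs of arbitrary length.

BLOCK = 8        # message-block size in bytes (assumed for illustration)
IV = 0x6A09E667  # arbitrary fixed initial chaining value

def toy_compress(h, block):
    # Toy 32-bit compression function (NOT secure; placeholder only).
    for b in block:
        h = ((h * 31) ^ b) & 0xFFFFFFFF
    return h

def md_pad(msg):
    # Merkle-Damgard strengthening: append 0x80, zero bytes, then the
    # original bit length, so messages of different lengths always
    # produce different final blocks.
    bitlen = 8 * len(msg)
    msg += b"\x80"
    while (len(msg) + 4) % BLOCK != 0:
        msg += b"\x00"
    return msg + bitlen.to_bytes(4, "big")

def md_hash(msg):
    # The domain extender itself: linear iteration over the blocks.
    h = IV
    padded = md_pad(msg)
    for i in range(0, len(padded), BLOCK):
        h = toy_compress(h, padded[i:i + BLOCK])
    return h
```

The survey's question is whether properties such as collision resistance of `toy_compress` carry over to `md_hash`; for plain iteration with strengthening, collision resistance is preserved but the pseudo-random oracle property is not.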
Aly M. El-Semary, Mostafa Gadal-Haqq M. Mostafa
Vol. 6, No. 4, pp. 481-500, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.481
Keywords: Data Mining, Fuzzy Logic, IDS, Intelligent Techniques, Network Security, Software Agents
The Internet explosion and the growth of crucial web applications such as e-banking and e-commerce make network security tools essential. One such tool is the intrusion detection system (IDS), which can be classified by detection approach as signature-based or anomaly-based. Even though intrusion detection systems are well defined, their cooperation with each other to detect attacks needs to be addressed. Consequently, a new architecture that allows them to cooperate in detecting attacks is proposed. The architecture uses software agents to provide scalability and distributability. It works in two modes: learning and detection. During learning mode, it generates a profile for each individual system using a fuzzy data-mining algorithm. During detection mode, each system uses FuzzyJess to match network traffic against its profile. The architecture was tested against a standard data set produced by MIT Lincoln Laboratory, and the preliminary results show its efficiency and capability to detect attacks. Finally, two new methods, the memory-window and the memoryless-window, were developed for extracting useful parameters from raw packets. The parameters are used as detection metrics.
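The memory-window and memoryless-window methods can be contrasted with a small sketch, assuming the former is a sliding (overlapping) window and the latter a tumbling (disjoint) one; the window size and the packet labels are illustrative, not the paper's actual parameters:

```python
# Two ways of extracting a per-window metric (here, a SYN count) from
# a packet stream.

from collections import deque

def memory_window_counts(packets, size):
    # Sliding window: each arriving packet yields a count over the last
    # `size` packets, so consecutive windows remember shared packets.
    win = deque(maxlen=size)
    counts = []
    for p in packets:
        win.append(p)
        counts.append(sum(1 for q in win if q == "SYN"))
    return counts

def memoryless_window_counts(packets, size):
    # Tumbling window: the stream is cut into disjoint chunks of
    # `size` packets; each chunk is summarized once and then forgotten.
    return [sum(1 for q in packets[i:i + size] if q == "SYN")
            for i in range(0, len(packets), size)]
```

Counts like these are the kind of parameter an anomaly-based detector would then match against its fuzzy profile.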
Dae-Suk Yoo, Seung Sik Choi
Vol. 6, No. 4, pp. 501-510, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.501
Keywords: sensor networks, Energy-Efficient MAC, S-MAC
Wireless sensor networks consist of sensor nodes which are expected to be battery-powered and are hard to replace or recharge. Thus, reducing the energy consumption of sensor nodes is an important design consideration in wireless sensor networks. For the implementation of an energy-efficient MAC protocol, Sensor-MAC (S-MAC), based on the IEEE 802.11 protocol and featuring energy-efficient scheduling, has been proposed. In this paper, we propose a Dynamic S-MAC that adapts dynamically to the network-traffic state. The Dynamic S-MAC protocol improves on the energy consumption of S-MAC by changing the frame length according to the network-traffic state. Using the NS-2 simulator, we compare the performance of the Dynamic S-MAC with that of the S-MAC protocol.
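The core idea, adapting the frame length to traffic, can be sketched as a simple control rule. The thresholds, bounds, and halving/doubling policy below are assumed for illustration; the paper's actual adaptation rule may differ:

```python
# Adjust the listen/sleep frame length from the traffic observed in
# the previous frame: shorten under load, lengthen when idle.

MIN_FRAME_MS = 250   # illustrative lower bound
MAX_FRAME_MS = 4000  # illustrative upper bound

def next_frame_length(frame_ms, pkts_per_frame, high=8, low=2):
    # Heavy traffic: shorter frames mean more frequent listen periods
    # and lower latency, at the cost of energy.
    if pkts_per_frame >= high:
        return max(MIN_FRAME_MS, frame_ms // 2)
    # Light traffic: longer frames mean longer sleep and less energy.
    if pkts_per_frame <= low:
        return min(MAX_FRAME_MS, frame_ms * 2)
    return frame_ms
```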
Md. Imdadul Islam, Nasima Begum, Mahbubul Alam, M. R. Amin
Vol. 6, No. 4, pp. 511-520, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.511
Keywords: Canny Filter, Color Inversion, Skewness, Kurtosis and Convolution
This paper proposes two new methods to detect the fingerprints of different persons based on one-dimensional and two-dimensional discrete wavelet transformations (DWTs). Recent literature shows that fingerprint detection based on the DWT requires less memory space than pattern recognition and moment-based image recognition techniques. In this study, four statistical parameters (cross-correlation coefficient, skewness, kurtosis, and convolution) of the approximation coefficients of one-dimensional DWTs are used to evaluate the two methods on fingerprints of the same person and of different persons. Across all statistical parameters, the second method detects fingerprints better than the first.
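Two of the four statistical parameters can be sketched on the approximation coefficients of a one-level DWT. A pure-Python Haar transform stands in here for whatever wavelet the authors chose, and the moment formulas are the standard population definitions:

```python
# One-level 1-D Haar DWT approximation, then skewness and kurtosis of
# the resulting coefficients.

import math

def haar_approx(signal):
    # One-level Haar approximation: scaled pairwise averages.
    return [(signal[i] + signal[i + 1]) / math.sqrt(2)
            for i in range(0, len(signal) - 1, 2)]

def skewness(xs):
    # Third standardized moment: asymmetry of the distribution.
    n, mu = len(xs), sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return sum(((x - mu) / sd) ** 3 for x in xs) / n

def kurtosis(xs):
    # Fourth standardized moment: tail weight of the distribution.
    n, mu = len(xs), sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / n)
    return sum(((x - mu) / sd) ** 4 for x in xs) / n
```

Matching would then compare these scalar signatures between a stored template and a query print.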
Juyoung Kang, Hwan-Seung Yong
Vol. 6, No. 4, pp. 521-536, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.521
Keywords: Data Mining, Spatio-Temporal Data Mining, Trajectory Data, Frequent Spatio-Temporal Patterns
Spatio-temporal patterns extracted from the historical trajectories of moving objects reveal important knowledge about movement behavior for high-quality LBS services. Existing approaches transform trajectories into sequences of location symbols and derive frequent subsequences by applying conventional sequential pattern mining algorithms. However, spatio-temporal correlations may be lost due to inappropriate approximations of spatial and temporal properties, and inefficient descriptions of temporal information decrease both mining efficiency and the interpretability of the patterns. In this paper, we address the problem of mining spatio-temporal patterns from trajectory data. We provide a formal statement of the efficient representation of spatio-temporal movements and propose a new approach to discover spatio-temporal patterns in trajectory data. The proposed method first finds meaningful spatio-temporal regions and then extracts frequent spatio-temporal patterns from the sequences of these regions using a prefix-projection approach. We experimentally show that the proposed method improves mining performance and derives more intuitive patterns.
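The prefix-projection step can be illustrated with a compact PrefixSpan-style miner over sequences of region identifiers. This shows the general technique only; the region symbols and support threshold are illustrative, and the paper's spatio-temporal region construction is not reproduced here:

```python
# PrefixSpan-style prefix-projection mining: grow a frequent prefix by
# one item at a time, recursing on the projected (suffix) database.

def prefix_span(sequences, min_support, prefix=()):
    # Support of each candidate item: number of sequences containing it.
    counts = {}
    for seq in sequences:
        for item in set(seq):
            counts[item] = counts.get(item, 0) + 1
    patterns = []
    for item, sup in counts.items():
        if sup < min_support:
            continue
        new_prefix = prefix + (item,)
        patterns.append((new_prefix, sup))
        # Project: keep each suffix after the first occurrence of item.
        projected = [seq[seq.index(item) + 1:]
                     for seq in sequences if item in seq]
        patterns.extend(prefix_span(projected, min_support, new_prefix))
    return patterns
```

For region sequences [A, B, C], [A, C], [A, B] with minimum support 2, the miner finds (A), (B), (C), (A, B), and (A, C).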
Maneesha Srivasatav, Yogesh Singh, Durg Singh Chauhan
Vol. 6, No. 4, pp. 537-552, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.537
Keywords: clustering, Debugging, Fault Localization, Optimization, Software Testing
Software debugging is the most time-consuming and costly activity in the software development process. Many techniques have been proposed to isolate different faults in a program, thereby creating separate sets of failing program statements. Debugging in parallel is a technique which proposes distributing a single faulty program segment into many fault-focused program slices to be debugged simultaneously by multiple debuggers. In this paper, we propose a new technique called Faulty Slice Distribution (FSD) to make parallel debugging more efficient by measuring the time and labor associated with a slice. Using this measure, we then distribute these faulty slices evenly among debuggers. For this we propose an algorithm that estimates an optimized grouping of faulty slices, using as a parameter the priority assigned to each slice as computed from its complexity. This enables the efficient merging of two or more slices for distribution among debuggers so that debugging can be performed in parallel. To validate the effectiveness of the proposed technique, we explain the process using an example.
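The even distribution of slices can be sketched as greedy load balancing: slices, prioritized by a complexity score, go one by one to the least-loaded debugger. The complexity values and the greedy rule are illustrative, not the paper's exact FSD algorithm:

```python
# Greedy (LPT-style) balancing of fault-focused slices across debuggers.

import heapq

def distribute_slices(slice_complexities, num_debuggers):
    # Min-heap of (current load, debugger id, assigned slices).
    loads = [(0, d, []) for d in range(num_debuggers)]
    heapq.heapify(loads)
    # Highest-complexity slices first, each to the least-loaded debugger.
    for name, cost in sorted(slice_complexities.items(),
                             key=lambda kv: -kv[1]):
        load, d, assigned = heapq.heappop(loads)
        assigned.append(name)
        heapq.heappush(loads, (load + cost, d, assigned))
    return {d: assigned for _, d, assigned in loads}
```

With slices of complexity {s1: 5, s2: 4, s3: 3, s4: 2} and two debuggers, both end up with a total load of 7.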
Sung-Jin Baek, Sun-Mi Park, Su-Hyun Yang, Eun-Ha Song, Young-Sik Jeong
Vol. 6, No. 4, pp. 553-562, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.553
Keywords: Server Virtualization, Grid Service, Grid Infrastructure, Power Efficiency, Cloud Computing
The core services in the cloud computing environment are SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service). Among these three core services, server virtualization belongs to IaaS and is a service technology to reduce server maintenance expenses. Normally, the primary purpose of server virtualization is building and maintaining a new, well-functioning server rather than using several existing servers, and improving various aspects of system performance. Often this presents an issue, in that expenses may need to increase in order to build a new server. This study uses a grid service architecture for a form of server virtualization which utilizes the existing servers rather than introducing a new server. More specifically, the proposed system enhances system performance and reduces the corresponding expenses by adopting a scheduling algorithm among the distributed servers and the constituents for grid computing, thereby supporting the server virtualization service. Furthermore, the proposed server virtualization system minimizes power consumption by adopting sleep servers, subsidized servers, and the grid infrastructure. The power maintenance expenses for the sleep servers are lowered by utilizing the ACPI (Advanced Configuration & Power Interface) standard, with the purpose of overcoming the limits of server performance.
Ahmed Salem
Vol. 6, No. 4, pp. 563-574, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.563
Keywords: Injection Flaws, SQL Injection, Intercepting Filter, Cross-site Scripting Vulnerability
The growing number of web applications in the global economy has made it critically important to develop secure and reliable software to support the economy's increasing dependence on web-based systems. We propose an intercepting filter approach to mitigate the risk of injection flaw exploitation, one of the most dangerous methods of attacking web applications. The proposed approach can be implemented in Java or .NET environments following the intercepting filter design pattern. This paper provides examples to illustrate the proposed approach.
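The intercepting filter pattern can be sketched as a chain of validators run over each request before it reaches the application. The patterns, filter names, and veto rule below are illustrative only, not the paper's actual rules (and real deployments should rely on parameterized queries and output encoding, not pattern matching alone):

```python
# A filter chain that vetoes requests whose parameters look like SQL
# injection or cross-site scripting payloads.

import re

SQLI_PATTERN = re.compile(r"('|--|;|\b(union|select|drop)\b)", re.I)
XSS_PATTERN = re.compile(r"(<script|javascript:|onerror\s*=)", re.I)

def sql_injection_filter(params):
    # Pass only if no parameter matches a SQL-injection signature.
    return not any(SQLI_PATTERN.search(v) for v in params.values())

def xss_filter(params):
    # Pass only if no parameter matches an XSS signature.
    return not any(XSS_PATTERN.search(v) for v in params.values())

def intercept(params, filters=(sql_injection_filter, xss_filter)):
    # Run the request through every filter; any single veto blocks it.
    return all(f(params) for f in filters)
```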
Hidayat Febiansyah, Jin Baek Kwon
Vol. 6, No. 4, pp. 575-596, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.575
Keywords: Proxy-Assisted, Periodic Broadcasting, Video-on-demand
Video-on-Demand services are increasing rapidly nowadays. The load on servers can be very high, even exceeding their capacity. For popular contents, we can use a Periodic Broadcast (PB) strategy using multicast to serve all clients. Recent developments of PB broadcast movie segments over multiple channels in certain patterns, so that users only need to wait for a small first segment before the service starts. However, users need higher download capacity to download multiple segments at a time. To compensate for this, a proxy server can reduce the required download bandwidth by holding some segments for a certain time. This research focuses on more recent PB schemes that could not be covered by previous proxy-assisted periodic broadcast strategies.
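One classic PB scheme makes the segment idea concrete: in Fast Broadcasting, channel i repeatedly broadcasts a segment of length 2^(i-1) · d, so a k-channel broadcast of a movie of length L gives a worst-case startup wait of d = L / (2^k - 1). This is a standard textbook example, not necessarily one of the schemes the paper targets:

```python
# Fast Broadcasting segment layout and worst-case client startup wait.

def segment_lengths(movie_len, channels):
    # Segment i (on channel i) has length unit * 2**i; the lengths sum
    # to the whole movie: unit * (2**channels - 1) = movie_len.
    unit = movie_len / (2 ** channels - 1)
    return [unit * 2 ** i for i in range(channels)]

def max_startup_wait(movie_len, channels):
    # Playback starts after at most one first-segment broadcast slot.
    return movie_len / (2 ** channels - 1)
```

For a 120-minute movie on 4 channels, the worst-case wait is only 8 minutes, which is the bandwidth-versus-latency trade-off a proxy then softens further.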
Chetna Gupta, Yogesh Singh, Durg Singh Chauhan
Vol. 6, No. 4, pp. 597-608, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.597
Keywords: Change Impact Analysis, Regression Testing, Software Maintenance, Software Testing
Software evolution is an ongoing process carried out with the aim of extending base applications, either to add new functionalities or to adapt software to changing environments. This brings about the need for estimating and determining the overall impact of changes to a software system. In the last few decades, many change/impact analysis techniques have been developed to identify the consequences of making changes to software systems. In this paper, we propose a new approach to change/impact analysis that classifies changes by (a) the nature and (b) the extent of change propagation. The impact set produced consists of two dimensions of information: (a) the statements affected by change propagation and (b) the percentage of statements affected in each category relative to the overall system. We also propose an algorithm for classifying the type of change. To establish confidence in its effectiveness and efficiency, we illustrate the technique with an example. The results of our analysis are promising for enhancing change classification. The proposed dynamic technique for estimating impact sets and their percentage of impact will help software maintainers perform selective regression testing by analyzing impact sets with regard to the nature of the change and change dependency.
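The two-dimensional impact set can be sketched as a small report: for each change category, the affected statements plus the percentage of the overall system they represent. The category names and statement numbering are illustrative, not the paper's classification:

```python
# Build a two-dimensional impact report from per-category statement sets.

def impact_set(affected_by_category, total_statements):
    report = {}
    for category, stmts in affected_by_category.items():
        report[category] = {
            "statements": sorted(stmts),                       # dimension (a)
            "percent": 100.0 * len(stmts) / total_statements,  # dimension (b)
        }
    return report
```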
Hyung-Min Lim, Kun-Won Jang, Byung-Gi Kim
Vol. 6, No. 4, pp. 609-620, Dec. 2010
https://doi.org/10.3745/JIPS.2010.6.4.609
Keywords: u-learning, E-learning, Event Hooking, Content packing
In order to provide tailored education for learners within the ubiquitous environment, it is critical to analyze the learning activities of learners. For this purpose, SCORM (Sharable Content Object Reference Model), IMS LD (Instructional Management System Learning Design), and other standards provide learning-design support functions such as progress checks. However, applying these standards requires content packaging, and because of the complicated standard dimensions, the facilitation level is lower than the work volume involved in developing the contents; additional work is also required whenever revision becomes necessary. In addition, since the learning results are managed by the server, data cannot be saved when the network is cut off. In this study, a system is realized to manage the actions of learners by intercepting web-browser events using event hooking. Through this technique, all HTML-based contents can be reused without additional work, and learning results can be saved and analyzed, mitigating the problems that follow from applying the standards. Furthermore, the ubiquitous learning environment can be supported by tracking learning results even when the network is cut off.