Today, recommender systems are widely used in many domains. There are numerous methods for generating recommendations and many parameters for tuning them, and each method has individual strengths and weaknesses in certain situations. Based on the real-world use case of an existing location-based service application, the research at hand shows that a custom-made hybrid recommender system can exploit these strengths and, at the same time, limit the weaknesses. The project analyses some of the most popular recommendation algorithms with respect to their predictive accuracy on datasets with different characteristics. For this purpose, a suitable evaluation method was designed and implemented in the form of an experimental setup and protocol. Six runtime factors characterizing each dataset are identified and investigated. The results show that these runtime factors directly influence the quality of the recommendations and that they affect different recommendation algorithms in different, sometimes opposing ways. These results are used to explore whether the predictive accuracy of a superordinate hybrid recommender system in a dynamic environment can be improved compared with each single subordinate recommendation algorithm. This is achieved by dynamically selecting or weighting the results of the respective sub-algorithms according to the current situation of use. For this purpose, two hybrid recommendation algorithms were developed and analysed in direct comparison with the conventional algorithms. It is demonstrated that one of them achieves the most accurate results over the whole range of runtime factor values under investigation, thus effectively evading the limitations of the specific sub-algorithms it utilizes. The second hybrid algorithm achieves relatively good results, but falls short of the author’s expectations to outperform all other methods, including the first hybrid.
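The weighted hybridisation the abstract describes can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the runtime factor (rating sparsity), the threshold, and the sub-algorithm names are all assumptions made for the example.

```python
def select_weights(sparsity):
    """Hypothetical weighting rule for one runtime factor: favour a
    content-based recommender on very sparse data, a collaborative one
    otherwise. Threshold and algorithm names are illustrative only."""
    if sparsity > 0.95:
        return {"content_based": 0.7, "collaborative": 0.3}
    return {"content_based": 0.3, "collaborative": 0.7}


def hybrid_predict(predictions, weights):
    """Combine per-item scores from several sub-algorithms as a weighted mean.

    predictions: {algorithm_name: {item_id: score}}
    weights:     {algorithm_name: weight}
    """
    items = {item for scores in predictions.values() for item in scores}
    total = sum(weights.values())
    return {
        item: sum(weights[name] * scores.get(item, 0.0)
                  for name, scores in predictions.items()) / total
        for item in items
    }
```

A switching hybrid, the other common variant, would instead pick a single sub-algorithm outright (weight 1.0) once a runtime factor crosses its threshold.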
Kevin Matz. Designing and evaluating an intention-based comment enforcement scheme for Java. http://computing-reports.open.ac.uk/2010/TR2010-22.pdf. 2010. M801 MSC Dissertation, 2010/22.
Software maintenance forms a significant portion of the cost of large-scale software projects. A time-consuming part of maintenance is program comprehension – reading legacy code to understand how and where to make changes. The process of understanding existing code involves reconstructing the design intentions and rationale of the original developers. This dissertation argues that explicitly recording intention and rationale information during design and construction eases program comprehension during maintenance. A survey of practicing software developers was conducted to understand difficulties in software maintenance and opinions on software documentation. The results, together with a literature survey, are then used to argue that significant problems exist which can best be dealt with by designing a new technology-based solution. By reviewing the program comprehension literature and examining past solutions, requirements are formulated for an “ideal” solution for recording intention and rationale documentation. A partial solution, Design Intention Driven Programming, is proposed, which encourages developers to record design intentions before writing code. The process is supported by a language, Java with Intentions, which adds intention comments, first-class documentation constructs, to the Java language. The compiler flags as errors any artefacts (e.g., classes) not described by intention comments and uses complexity metrics to detect “empty” comments. A rudimentary prototype of a precompiler for the language and a sample application are constructed as proofs of concept. The solution is evaluated using several analyses and by surveying developers for feedback on its practicality. Respondents’ opinions are divided on the solution’s feasibility and utility.
Numerous problematic issues are identified, including resistance of developers to write documentation, limitations of the documentation enforcement mechanism, and the lack of concrete evidence of long-term cost savings. The evaluation suggests that, while the approach may be promising for some projects and teams, its unpopularity with most developers renders it impractical for typical commercial projects.
C Hughes. Sound Spheres: A non-contact virtual musical instrument played using finger tracking. http://computing-reports.open.ac.uk/2010/TR2010-23.pdf. 2010. M801 MSC Dissertation, 2010/23.
The creation and performance of music is predominantly and traditionally reliant on direct physical interaction between the performer and a musical instrument. The advent of electronics and computing has given rise to many new electronic musical instruments and interfaces. Recent advances in these areas have seen an emerging trend towards the design of virtual musical interfaces in which audio is synthesized and played back based on a musician’s body movements captured by some gestural interface. Designing new electronic or virtual musical instruments necessitates consideration of many factors that affect their control and playability. The research described in this dissertation concerns the design and construction of a new non-contact virtual musical instrument (called Sound Spheres) that uses finger tracking as its gestural interface. The dissertation identifies control parameters and key factors that are considered important for the design of such instruments and investigates whether these can be successfully achieved in a non-contact virtual musical instrument played by finger tracking. Results show that the control parameters of pressure, speed, and position can successfully be implemented for a non-contact virtual musical instrument; results for the angle control parameter, however, were inconclusive. Furthermore, the results present evidence that the finger tracking technique is an effective method for playing such an instrument.
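A mapping from tracked fingertip data to sound-control parameters of the kind the abstract discusses might look like the sketch below. The specific assignments (horizontal position to pitch, vertical position to a pressure-like value, fingertip speed to velocity) and all ranges are hypothetical illustrations, not Sound Spheres' actual design.

```python
def finger_to_midi(x, y, speed, width=640.0, height=480.0):
    """Illustrative mapping from one tracked fingertip to MIDI-style controls.

    x, y:  fingertip position in camera pixels (assumed 640x480 frame)
    speed: fingertip speed, pre-scaled to roughly 0..127
    """
    # Horizontal position selects pitch across two octaves, C4..C6.
    note = 60 + int((max(0.0, min(x, width)) / width) * 24)
    # Vertical position gives a pressure-like modulation value (top = max).
    modulation = int((1.0 - max(0.0, min(y, height)) / height) * 127)
    # Fingertip speed becomes note velocity, clamped to the MIDI range 1..127.
    velocity = max(1, min(127, int(speed)))
    return note, modulation, velocity
```

In a real instrument these values would feed a synthesizer each frame; the interesting design questions, as the dissertation notes, lie in which parameters players can control reliably without tactile feedback.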
M Beatty. A Step Towards Reader Acceptance of Hypertext Fiction: From Annoying Distraction to Enjoyable Experience. http://computing-reports.open.ac.uk/2010/TR2010-24.pdf. 2010. M801 MSC Dissertation, 2010/24.
The first hypertext novel (Douglas, 2000), afternoon, a story (see bibliography, Joyce 1990), was published in 1990. Despite existing for two decades, hypertext fiction is little known amongst “ordinary” readers of fiction and has failed to achieve the popularity and mainstream audience envisaged by early theorists. The domain itself is highly theorised, but there is little, if any, empirical evidence available to back up the sometimes grand claims of theorists. In fact, research conducted to date highlights the frustration and disorientation readers experience and attempts to present conventions and guidelines that authors and designers should follow in order to improve the experience (Pope 2006, Gee 2001). This research aimed to investigate hypertext fiction from the reader’s perspective in an effort to identify features that hinder or foster enjoyment. Readers were presented with a variety of hypertext fictions and asked to join an online discussion group to present their views and opinions. The qualitative data gathered was analysed to identify important themes raised by the participants. Subsequently, more data was gathered from a questionnaire, designed in relation to the qualitative data, in an attempt to corroborate the initial analysis. It is interesting that while the qualitative data was largely negative, the questionnaire results were less so. It was found that readers are not averse to hypertext fiction and that the majority of participants would choose to read it again in the future, although they would opt for a text-only work. Although the multimedia and gaming elements contained in the hypertext fictions in this research were not considered particularly enjoyable, participants saw their potential. Participants would be willing to interact with hypertext fiction through such features if they were intuitive to use, added something to the story and were seamlessly integrated.
Participants want to control hypertext fiction, particularly with regard to pace of reading, length of time spent reading and interactive elements (such as multimedia and gaming). Indeed, it was found that participants want much more control over the experience than the works in this study allowed them. Due to the subjective nature of reading, the small number of participants and the limited number of hypertext fictions presented in this study, it is not considered possible to generalise the results. However, it seems clear that authors would be able to attain a wider audience for their work as long as they consider the reader with regard to interface, design, interactions, writing style and plot. Future research could be conducted using younger participants, a different selection of participants and a wider range of hypertext fictions.
M Clarke. IT Governance Design: An Application of Problem Oriented Engineering to Enterprise Architecture, TOGAF and SOA Development. http://computing-reports.open.ac.uk/2010/TR2010-25.pdf. 2010. M801 MSC Dissertation, 2010/25.
This dissertation investigates the discipline of Enterprise Architecture in two ways. Firstly, as a fundamental part of IT Governance and, secondly, regarding its use for effective management and co-ordination of an organisation and the deployment of its IT solutions and applications. Enterprise Architecture should help enable an organisation to achieve its strategic goals. TOGAF is a leading framework, which provides a selection of tools and best-practice methodologies for Enterprise Architecture practitioners. It is a relatively new development, especially version 9 (the latest release), and there are few examples in the literature of studies into its successful and effective application in pragmatic organisational practice. TOGAF has its roots in a technical approach, although, in line with IT Governance precepts, in its latest releases it urges close and wide stakeholder involvement in Enterprise Architecture initiatives. Service Oriented Architecture (SOA) embodies and extends many of the principles of best practice in software engineering to provide an approach which can better address future business requirements in a flexible and more cost-effective way. To successfully implement SOA, a holistic view of an organisation’s IT architecture is essential, and this is informed by the close involvement of the wider stakeholder community that is a fundamental principle of Enterprise Architecture and TOGAF. This dissertation investigates stakeholder engagement in Enterprise Architecture initiatives through a case study. It examines in particular the success of applying TOGAF as an IT Governance framework in terms of involving stakeholders in the early design stages of an SOA development. The case study is based on the author’s own experience applying TOGAF in a UK FTSE-100 company.
A survey of UK IT end-user organisations was also carried out to determine whether the results of the case study investigation could be generalised across the UK Enterprise Architecture practitioner community. Problem Oriented Engineering (POE), an emerging theoretical framework for design, was successfully used as a research methodology. It was found to be suitable for the analysis due to its emphasis on stakeholder involvement in problem and solution exploration and validation, and on exploration of the consequent risks.
P Taylor. A study into the usability of the Formal Systems Model to investigate the Critical Success Factors that have been accepted for the management of an IT project. http://computing-reports.open.ac.uk/2010/TR2010-26.pdf. 2010. M801 MSC Dissertation, 2010/26.
Critical Success Factors (CSFs) are viewed as the key areas within a project where things must go right and that must receive constant and careful attention from management for a project to be a success. The approach is not without its problems. Firstly, the inter-relationships between CSFs are at least as important as the individual factors, but the CSF approach does not provide a mechanism for taking account of these relationships. Secondly, the factor approach tends to view implementations as a static process instead of a dynamic phenomenon. It ignores the potential for a factor to have varying levels of importance at different stages of the implementation process. Studies to date have focused on ways of overcoming these difficulties found with the CSF approach. This study investigates the use of two adaptations of the Formal Systems Model (FSM), a model which is claimed to be able to overcome the difficulties found within the factors approach, as well as being able to distinguish between successful and unsuccessful projects. This research was conducted by adopting the Systems Failures Approach (SFA) as a guide to study the data gathered from two UK based implementations of computerised systems. The data was fed into the FSM adaptations to consider the usability of the project-specific FSM in comparison to the FSM when used as a framework for investigating factors critical to success in implemented IT (Information Technology) and IS (Information Systems) projects. The results of this research show that the project-specific FSM is capable of distinguishing between projects perceived as successful and unsuccessful. It can highlight factors practitioners may perceive as critical to success, and also be used as a framework for investigating CSFs like the FSM.
Whilst the results reveal positive characteristics of the project-specific FSM, such as a more focused approach achieved by mapping the failings associated with projects directly onto its components, its biggest challenge must be to overcome the reluctance of Project Managers and Practitioners to use methods and techniques in the management of their projects. Without this increase in use, White and Fortune’s claim that the project-specific FSM is more accessible to Practitioners and Project Managers than the FSM cannot fully be substantiated.