DBpedia: a nucleus for a web of open data, by S. Auer, C. Bizer, G. Kobilarov, J. Lehmann, R. Cyganiak and Z. Ives, in Proceedings of the 6th International Semantic Web Conference (ISWC 2007), Busan, Korea, November 2007, LNCS 4825, pp. 722-735
The last 30 years have seen a number of attempts at information integration by computer scientists, proceeding alongside Semantic Web efforts and their associated technology developments. However, these tasks still challenge the current Web. In this article, Auer et al. (2007) attempt to integrate information from across various web systems and to turn Wikipedia content into a machine-readable representation, both as structured formats and as semantic data sets.
The authors provide a relatively comprehensive overview of existing problems and challenges, such as:
(1) Web information is not fully accessible to a general audience
(2) the inconsistency, ambiguity, uncertainty, and unclear provenance of grass-roots data
(3) the need for collaborative approaches to sharing dynamic data in order to build the Semantic Web in a grass-roots style
(4) the need for a new model of structured information representation and management
Extending concepts and approaches from the W3C Linking Open Data community project and extracting structured information from Wikipedia, the authors argue in favor of the Resource Description Framework (RDF) triple model, which provides a flexible data model for representing and publishing information on the Web. RDF is a basic foundation for giving one or more types to a resource, with each statement expressed as a triple: (subject, predicate, object), or equivalently (subject, property, property value). In the DBpedia model, RDF triples extracted from data sets are the basic components that can be shared, exchanged, and queried in a variety of Semantic Web applications.
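The triple model described above can be sketched in plain Python: a statement is a (subject, predicate, object) tuple, and a dataset is a set of such tuples. The specific URIs below are illustrative assumptions following DBpedia's naming conventions, not examples taken from the article:

```python
# A minimal sketch of the RDF triple model: each statement is a
# (subject, predicate, object) tuple, and a dataset is a set of them.
# NOTE: these URIs are made-up examples in DBpedia's style.
triples = {
    # Typing a resource: "Busan is a City"
    ("http://dbpedia.org/resource/Busan",
     "http://www.w3.org/1999/02/22-rdf-syntax-ns#type",
     "http://dbpedia.org/ontology/City"),
    # A (subject, property, property value) statement, infobox-style
    ("http://dbpedia.org/resource/Busan",
     "http://dbpedia.org/ontology/country",
     "http://dbpedia.org/resource/South_Korea"),
}

def objects_of(subject, predicate, data):
    """Pattern query: all objects matching (subject, predicate, ?)."""
    return {o for (s, p, o) in data if s == subject and p == predicate}

print(objects_of("http://dbpedia.org/resource/Busan",
                 "http://dbpedia.org/ontology/country",
                 triples))
```

Real Semantic Web applications would use an RDF store and SPARQL for this kind of pattern matching; the set-comprehension query here only illustrates why the uniform triple shape makes data easy to merge and query.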
Several of the most valuable datasets, including article abstracts (concept descriptions), infoboxes (data attributes for concepts), article categories (using SKOS), YAGO types (instance classifications using YAGO), internal page links, and external RDF links, are provided for download as sets of RDF files, in which each resource is identified by its own URI reference.
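As a sketch of what such downloadable RDF files look like, a common line-oriented serialization for RDF dumps is N-Triples, which is simple to parse. The sample line and toy parser below are illustrative assumptions (the parser handles only URI-valued triples, not literals or blank nodes):

```python
import re

# Each N-Triples line has the form: <subject> <predicate> <object> .
# Toy parser: URI-only triples, no literals or blank nodes.
LINE = re.compile(r'<([^>]*)>\s+<([^>]*)>\s+<([^>]*)>\s*\.')

def parse_ntriples(text):
    """Yield (subject, predicate, object) tuples from N-Triples lines."""
    for line in text.splitlines():
        m = LINE.match(line.strip())
        if m:
            yield m.groups()

# A made-up sample line in DBpedia's style.
sample = (
    '<http://dbpedia.org/resource/DBpedia> '
    '<http://www.w3.org/2000/01/rdf-schema#seeAlso> '
    '<http://dbpedia.org/resource/Semantic_Web> .'
)
print(list(parse_ntriples(sample)))
```

Because every line is a self-contained statement, files from different datasets (infoboxes, categories, page links) can simply be concatenated and loaded into one triple store.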
2008-12-22
2008-12-17
ArticleRead (13): User Experience at Google: focus on the user and all else will follow
User Experience at Google: focus on the user and all else will follow, by Au, I., et al. (2008), in CHI 2008 Proceedings Extended Abstracts, ACM Press, pp. 3681-3686
Which research approaches can ensure that user experiences are interpreted to reflect the underlying norms of online users, and help designers better predict user behavior during the system design process? The case of Google in this article demonstrates a multi-method approach to user experience research based on its corporate philosophy: "Focus on the user and all else will follow".
On the one hand, Google has traditionally taken a data-driven approach, applying web analytics as a quantitative investigation of what is happening. On the other hand, building on a qualitative approach, Google interprets the contextual factors of why users interact with system designs via field research, diary studies, and face-to-face interviews. This approach is applied by the Google user experience (UX) team in exploring user behavior in the mobile version of Google Maps. They follow a method called Mediated Data Collection, in which participants and mobile technologies mediate the collection of data about use in natural settings. Methods such as prior log analysis, recorded usage, focus group studies, field trials, telephone interviews, and lab debriefs are therefore combined in investigating user behavior.
This article stresses a bottom-up company culture as the key for designers and project managers to understand the essence of user experience. Three techniques are employed: (1) injecting the corporate DNA by educating and training engineers and PMs about user experience (e.g. the 'Life of a User' training program and 'Field Fridays'); (2) scaling the UX team to support hundreds of projects; (3) helping focus projects on user needs through the UX team and a user research knowledge base.
Unlike traditional desktop software design, updated on an annual basis, the Google UX team practices agile techniques to respond to rapid web release cycles. For example, solutions include guerrilla usability testing, prototyping on the fly, online experimentation, and enabling a live instant-messaging dialogue between observers and the moderator during lab-based testing.
These three approaches are also combined with a global product perspective of designing for multiple countries. In sum, the four elements of the Google case provide an alternative analytic framework and a useful enumeration of methodologies for studying online user experience in practice.
2008-12-09
ArticleRead (12): The credibility of volunteered geographic information
The credibility of volunteered geographic information, by Andrew J. Flanagin and Miriam J. Metzger, in GeoJournal (2008) 72:137–148
The study of Flanagin and Metzger (2008) examines the issues of information and source credibility in the context of volunteered geographic information (VGI).
VGI, along with similar concepts such as GIS/2, neogeography, or "geography without geographers", has been regarded as extending public participation geographic information systems (PPGIS), collaborative GIS, participatory GIS, and Community Integrated GIS (CIGIS) to the general public. As the advance of social computing has had parallel effects on the production and availability of user-generated geographic data, the authors propose re-conceptualizing the traditional definitions of information and source credibility.
The discussion of VGI credibility rests on two concepts: Goodchild's (2007) notion of "humans as sensors", and the social-science perspective in which credibility is "a subjective perception on the part of the information receiver". Under this notion of "credibility-as-perception", VGI credibility contrasts with that of traditional geographic information, which is produced by a few individual authorities and treated as a relatively objective property of the information rather than "a subjective perception".
The overall recommendations for credibility judgments of VGI are listed as eight points in a figure in the article. Suggested research directions include: user motivations; the "credibility transfer" phenomenon (geographic data being perceived as more objective than other forms of user-generated data); market implications; measurement issues (e.g. the provenance of VGI); and the effects of VGI in social, educational, and political contexts.