Introduction

With the advent of the internet, consumers are increasingly turning to computer-mediated communication (CMC), such as online communities, to get information to use in their decision-making processes.1 Steffes and Burgee2 suggest that the rise in CMC has made way for an internet-facilitated variation of traditional word of mouth, referred to in research as electronic word of mouth (eWOM).

As online communities grow, so does the need for marketers to understand the influences and processes that are taking place within eWOM. Research has already established the power of eWOM in influencing the sales of movies, hotels, restaurants and other products and services.3, 4, 5, 6, 7 Other scholars have investigated the volume and dispersion of eWOM,3 the motivations of consumers for posting reviews on consumer opinion platforms,8 online feedback mechanisms,9 and the determinants of online review adoption.10 However, less interest has been directed towards the determinants of information quality and source credibility of eWOM in online communities.10

The analysis of consumer perceptions of the source of a communication and of the quality of the message it conveys is paramount in online environments, where people use information and source cues to evaluate the reliability of information provided by anonymous sources. The aim of this paper is to gain a further understanding of the key variables of information quality and perceived source credibility of eWOM in the context of online customer support communities. By understanding how consumers perceive the information and the source of such information in online communities, marketers can identify the factors that need to be addressed in order to improve the quality of the information provided in such communities and, as a result, the overall quality of online customer service. This goal is achieved by examining how consumers perceive the source of communications and the quality of information by using netnographic methods in a popular online customer service community, namely, the Dell online community.

Background and literature review

Westbrook11 defines word of mouth (WOM) as informal communications directed at other consumers about the ownership, usage or characteristics of particular goods and services and/or their sellers. Bickart and Schindler12 highlight how WOM communication typically involves one friend or relative in a face-to-face situation sharing product information with another via the spoken word.

As the internet environment has evolved, WOM research has expanded to the online context, giving rise to eWOM.1, 2, 4, 13, 14, 15 Through electronic media, such as online discussion forums, electronic bulletin-board systems, chatrooms, newsgroups, blogs, review sites and social networking sites, the web has created opportunities for eWOM communication.10, 14, 15, 16 Hennig-Thurau et al. (p. 39)15 define eWOM communication as ‘any positive or negative statement made by potential, actual, or former customers about a product or company, which is made available to a multitude of people and institutions via the internet’.

Compared to word of mouth, ‘eWOM harnesses the bi-directional communication properties and unlimited reach of the internet to share opinions and experiences on a one-to-world platform rather than a one-to-one platform’ (p. 9).13 Unique features of eWOM compared to traditional word of mouth are its unprecedented speed of transmission over the internet4 and the ability within online communities to refer back to earlier discussions, with both synchronous and asynchronous discussions enabled.12 It has been suggested that eWOM is more influential than WOM due to its speed, convenience, one-to-many reach and absence of face-to-face human pressure.14 However, contradicting this, Cheung et al.13 argue that consumers deliberate on the credibility of eWOM to a greater extent than that of traditional word of mouth, only accepting online advice that they perceive as credible.

Compared to WOM, which is an immediate, intimate conversation, eWOM is most frequently an asynchronous process whereby the sender and receiver of information are separated by both space and time.2 Furthermore, eWOM eliminates the restrictions of time and location, with asynchronous discussions usually being kept for some time to allow other users to participate or read the messages at their own pace.13 Previous studies have contributed to the understanding of eWOM by examining online consumer discussion forums,1, 2, 13, 17 particularly to study the volume and dispersion of eWOM,3 to identify what motivates consumers to articulate themselves on consumer opinion platforms,15 to research online feedback mechanisms,9 and to investigate online review sites and the word-of-mouth effect in the presence of positive feedback mechanisms.4

Online word of mouth has been acknowledged as a critical tool for facilitating information diffusion throughout online communities.14 Access to the internet, and therefore access to online communities, has grown substantially, with 70 per cent of UK adults accessing a high-speed broadband connection.18 Consumers join online communities to ask for advice and information and for assistance in decision making, prior to purchasing.19 Sun et al. (p. 1106)14 define online communities as ‘online social entities that are maintained by individuals to exchange shared interests or values with current and potential community members in an ongoing manner without physical interaction’.

Marketing researchers have recently started to investigate online communities.19, 20, 21 The types of online communities that have been researched include communities of interest (eg pogo.com), communities of relationship (eg mumsnet.com), communities of transaction (eg amazon.com) and communities of practice (eg Google Docs).22 This study focuses on consumer communities of practice. Communities of practice are ‘groups of people who share a concern, a set of problems or a passion about a topic, and who deepen their knowledge and expertise in this area by interacting on an ongoing basis’ (p. 4).23

Compared to face-to-face communicators, consumers involved in online communities have been found to demonstrate fewer inhibitions, display less social anxiety and exhibit less public self-awareness, and tend to be more honest and forthcoming with their viewpoints, which could be due to the anonymity offered by the internet.14 Online communities provide a wide range of opinions and recommendations regarding companies and products, with consumers’ preferences forming within communities as they exchange opinions about the products and services while observing one another’s purchases.24

However, although eWOM involves a basic information transfer between a source and a receiver, scholars have rarely addressed the dimensions of information quality in eWOM, or how consumers perceive the source of information in online customer support communities.10

Information quality is defined as ‘the quality of the content of a consumer review from the perspective of information characteristics’ (p. 128).25 Within an online community, a community member’s decision to adopt the information provided through eWOM can be determined by the perceived quality of information they receive.10

Due to the lack of quality control mechanisms on the web, it is often difficult for online community members to make judgements on information quality, and this evaluation is therefore often based on subjective perceptions. In fact, online community members use personal strategies to raise questions, collect feedback and distinguish quality answers from others,26 and it is their judgement of a message’s quality that determines how much they learn from and adopt the received message.27

However, research on information quality dimensions in eWOM has produced a plethora of dimensions, each of which contributes to defining a piece of information as being of high quality. Different authors therefore refer to and measure information quality using different constructs in eWOM research. For instance, Cheung et al.10 use comprehensiveness, timeliness, accuracy and relevance; Filieri and McLeay28 added two further dimensions: value-added, which refers to the extent to which information is beneficial and empowers users through its use, and completeness, which is defined as the extent to which information is of sufficient breadth, depth and scope for the task at hand.29

Research on the determinants of information quality in eWOM in the context of online communities is scant. The only available study is that by Cheung et al. on a food community in Hong Kong.10 It found that information relevance and completeness were the most important influencers of perceived information usefulness, while timeliness and accuracy were neither strong nor significant influencers.

This research study aims to explore the dimensions that are used by consumers to evaluate the quality of information in an online company-driven customer service community.

Electronic WOM is a communication process between a sender and a receiver,16 and therefore another important aspect that needs to be considered when investigating eWOM is the source of communication and its perceived credibility.10 Source credibility is the extent to which an information source is perceived by information recipients to be believable, competent and trustworthy.10 Research on source credibility began by looking at the influence of source credibility on communication effectiveness.30 However, the source credibility attributes highlighted in traditional word-of-mouth research are difficult to assess within online communities, due to the anonymity of the authors of online messages.4 Cheung et al.13 therefore suggest that it is the responsibility of the receiver of the message to judge how credible the eWOM communicator is by analysing salient cues about the source, such as its reputation and credibility, either through the helpful votes provided by other eWOM users or through the number of postings the contributor has made within the online community.

Supporting this, in an online environment there is unlimited freedom for consumers to express their feelings towards certain products or services without disclosing their real identity.10 Other consumers are therefore left to determine the expertise and trustworthiness of the contributors in order to either adopt or reject the information presented. For both traditional and online media, source credibility has consistently included two major dimensions: source expertise and source trustworthiness.10, 16, 31, 32 The former refers to ‘the perceived competence of the source providing the information’ (p. 6),31 while the latter is defined as the ‘possible bias/incentives that may be reflected in the source’s information’ (p. 6).31

Existing research on eWOM has not given much attention to the perceived credibility of the source in the context of online communities. The only available study focuses on an online food community and found that source expertise and trustworthiness do not have an influence on the perceived usefulness of information.10 This contradicts findings in the traditional WOM context but is understandable, taking into account online anonymity.33 However, how do consumers perceive the source of communications in online service support communities? This study aims to investigate consumer perceptions of the level of credibility of posts provided in online customer service communities.

Methodology

This study is concerned with gaining a rich and subjective insight into the perceptions of online community members of eWOM information and information sources, and therefore an interpretivist approach has been applied. The interpretivist philosophy advocates that, in order for the researcher to be able to understand the actions of the online community members, it is necessary to explore the subjective meanings that motivate their actions.34 The data obtained was therefore qualitative, based on meanings expressed through words,35 observed directly from the social actors within an online community. In line with the interpretivist approach, qualitative research was conducted that allowed the researcher to explore an online community in as real a manner as possible.36

A popular online customer service community, the Dell community (en.community.dell.com) was chosen as the basis for this case study. Dell was selected as it connects with more than 5.4 million customers every day, on the phone and in person, but particularly through the use of online communities.37 The company’s presence across many types of online communities, such as Twitter, LinkedIn, Facebook and its highly regarded web-based online community, accessible through the Dell-hosted website, has seen the firm recognized as the second most respected brand for breadth and depth of social media activities.38 It is the company’s commitment to customer feedback that led to Dell receiving the 2010 Forrester Research Voice of the Customer Award.39

At the same time, the internet and, in particular, online communities have been widely used as a focus for research and as a tool for data collection in marketing research. For this study, netnography, an interpretive method,40 was selected because it is a qualitative research technique that adapts ethnographic research methods to the study of computer-mediated communications.17, 41 Netnography has been identified as an established tool for marketing research based on studying internet-based or online communities,19 and uses largely textual data collected from field notes and artefacts of the online community, which in the case of this study are the postings by contributors.42

The Dell customer service community was selected as the focus for this study as it included a variety of threads about topics important to consumers of Dell products. The threads were publicly available to all internet users, with no need to register in order to read them. In addition, Dell is a well-known organization and, since 2006, more than 100,000 customer ratings and reviews have been shared via its online community, based on direct customer feedback.43 The research team therefore expected that it would provide a rich assortment of data. This study sampled the ten largest threads (see Table 1) within the Dell online community, as these were determined to represent the main ‘communities of practice’ within the online community, having attracted the most contributions.44 The ten threads spanned the period January 2004 to December 2010 and collectively provided 950 postings, all of which were analysed by the authors of this study.

Table 1 Findings emerging from the analysis of threads in the Dell community

Findings

In order to code the messages in the Dell online community and to separate the content into fixed units, each post within a thread was classified as a separate unit.45 Each post received a classification letter based on the thread it came from and a number based on its order of contribution.40 Examples and coding rules for each category were then determined to establish under what circumstances a post could be coded within a category, and these were presented together within a coding agenda.46 Table 1 provides a summary of the findings.
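
To illustrate this unitization step, the following is a minimal sketch in Python of how posts could be labelled in this way. It is not the authors’ actual coding instrument: the class name, thread letter, example posts and category codes are purely illustrative assumptions.

    from dataclasses import dataclass, field

    @dataclass
    class CodedPost:
        unit_id: str                               # eg 'B1' = thread B, first post
        text: str
        codes: dict = field(default_factory=dict)  # eg {'relevance': 'high'}

    def unitize(thread_letter, posts):
        """Turn the raw posts of one thread into fixed coding units."""
        return [CodedPost(unit_id=f"{thread_letter}{i}", text=text)
                for i, text in enumerate(posts, start=1)]

    # Hypothetical example: two posts from a thread, coded manually afterwards
    # against the coding agenda.
    units = unitize("B", ["My order has been delayed again.",
                          "Has anyone tried the new community site layout?"])
    units[0].codes["relevance"] = "high"   # on the thread topic
    units[1].codes["relevance"] = "low"    # off-topic for this thread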

In line with the coding agenda, for a thread to be categorized as ‘high relevance’, the majority of posts within the thread needed to be categorized as relevant to the thread topic. Posts unrelated to the thread topic were categorized as ‘low relevance’. All but two of the threads analysed were categorized as ‘high relevance’, as the majority of posts within them were of high relevance to the thread topic; threads B and C were categorized as ‘medium relevance’ as they had a relatively equal number of high- and low-relevance posts. Low-relevance posts within threads B and C tended to veer between the topics of ‘Long Wait for Studio XPS’ and ‘Next Steps for Community Site’, rather than focusing on issues such as the specifications of the laptops members had ordered (thread B) or how Dell should improve both delivery times and customer service (thread C).
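
The thread-level categorization rule described above amounts to a majority rule over the post-level codes. The following minimal sketch, building on the unitization example earlier, assumes each post already carries a ‘high’ or ‘low’ relevance code; the tie band used to capture ‘relatively equal’ counts is an illustrative assumption rather than a threshold reported in the study.

    def thread_relevance(post_codes, tie_band=0.1):
        """post_codes: list of 'high'/'low' relevance labels, one per post."""
        high_share = sum(1 for c in post_codes if c == "high") / len(post_codes)
        if abs(high_share - 0.5) <= tie_band:      # relatively equal numbers
            return "medium relevance"
        return "high relevance" if high_share > 0.5 else "low relevance"

    # Hypothetical example: a thread where 7 of 10 posts are on-topic.
    print(thread_relevance(["high"] * 7 + ["low"] * 3))   # -> high relevance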

Interestingly, despite the topics of thread B and thread F being relatively similar with regard to the ‘Long wait’ and ‘order status’ of the Studio XPS, more posts were relevant within thread F than thread B. This could be due to the broader topic of thread F, which may provide more opportunities for posts to be categorized as relevant. Findings suggest that, by selecting a thread with a topic of interest, members within the Dell online community are able to locate relevant information and share their thoughts on issues they may be experiencing or that may be of interest to them. There is little evidence that members post information that is not relevant to the thread topic.

A thread was deemed to be accurate if there was high recognition by thread members of the accuracy of posts within the thread. A post was categorized as high accuracy if other members within the thread perceived it to be ‘correct and near to the true value’, while a post was categorized as low accuracy if other thread members perceived it to be incorrect. Salient cues such as the star ratings available in the threads were not used to judge accuracy, as there was a lack of consistency in their use within the online community.13 Four of the threads (A, G, H and I) were regarded as high accuracy, with the remaining six threads (B, C, D, E, F and J) regarded as low accuracy.

Statements such as ‘I agree’, ‘you are right’ and ‘I’m glad to see us on common ground’ tended to be used within the threads to recognize the accuracy of a previous post, and the majority of posts categorized as high accuracy included negative statements about Dell. The threads that were categorized as low accuracy included similar statements to those above; however, these were either few in number or contained in posts categorized as low accuracy. Posts recognizing the inaccuracy of other posts included statements such as ‘I think you’ll find you’re mistaken’, ‘I’m sorry, I have to disagree’ and ‘how dare you say that?’

The ‘low accuracy’ categorization of most threads could be explained by the fact that thread members are more likely to comment if they feel a post is inaccurate than if they feel it is accurate, or members may not comment on the accuracy of posts at all. Another explanation, potentially demonstrated in threads B, D, F and J, could be that, due to the nature of certain threads, members are more concerned with posting their own experiences/opinions than with commenting on the accuracy of others’. Findings suggest that the accuracy of posts within a thread is not always recognized by other members; however, posts are more likely to attract recognition if they are inaccurate than if they are accurate.

This also means that users are more likely to comment on the inaccuracy of a previous comment than on its accuracy. Furthermore, posts that are categorized as accurate tend to include negative statements about Dell. It appears that accuracy is difficult for members to evaluate within the online community, especially because the star rating system provided is not used by community members. This is clear from one user’s comment: ‘Stars??? What are they for??? What good are they, especially if I give bad advice and then someone clicks all 5 stars??? They’re useless’. This post shows that users do not rely on this mechanism to evaluate the usefulness of information, probably because they are aware of how easy it is to manipulate. The rating mechanism was created to help consumers discern highly accurate comments from low-accuracy information, but it does not appear to fulfil this purpose.

For a thread to be determined as having high comprehensiveness, the majority of posts within the thread needed to be categorized as having ‘high message comprehensiveness’. For a post to be categorized as having high comprehensiveness, it had to be perceived as complete and entire and to provide all necessary information. The majority of the threads (A, D, E, F, G, H and J) were determined as having high comprehensiveness, while threads C and I were determined as having low comprehensiveness. Due to its equal number of high- and low-comprehensiveness posts, thread B was determined to have medium comprehensiveness.

To provide a complete and entire picture, certain posts categorized as high comprehensiveness included direct copies of communications with Dell Customer Services, such as letters and emails, while others, particularly posts related to waiting times for orders, included dates relevant to the poster’s situation, such as order, estimated and actual delivery dates. Many of the posts categorized as high comprehensiveness were negative towards Dell. The posts categorized as low comprehensiveness tended to make statements without providing further explanation. Within threads C and I, the majority of posts provided incomplete suggestions or information. In thread I, however, this could have been because the thread was between two members, who therefore may not have felt the need for every post to be complete and entire. Findings confirmed that most of the information provided within the threads had high message comprehensiveness and that members can usually find complete and entire explanations behind statements made about Dell, its products and services.

For a thread to be categorized as having high trustworthiness, there needed to be high recognition of source trustworthiness within the thread. In line with the coding agenda, for a member or a source related to the thread to be categorized as having high source trustworthiness, they had to be perceived as trustworthy and of good character by other thread members. All the threads but one were categorized as low trustworthiness: threads B, C and E because there was little recognition of either high or low trustworthiness, and threads A, D, F, G, H and J because there tended to be higher recognition of posts categorized as low trustworthiness. Thread I was categorized as high trustworthiness as the first member appears to place trust in the second member by following the instructions provided to them throughout.

In almost every thread, questions similar to ‘Do you work for Dell’, ‘Are you a Dell employee’ and ‘…I have to wonder if this poster is an employee of Dell’ were asked, appearing to question the integrity of other members within the thread. Other than this, the main focus of lack of trust across all the threads tended to be directed towards the low trustworthiness of Dell. Examples included ‘I do not trust 100 per cent any advice I get from Dell’, ‘I can’t trust Dell with any future orders. so they’ve lost my business’ and ‘Dell will have to pay for the way they treat their customers, I strongly believe’. Such statements appear consistently throughout the majority of threads that portray Dell in a negative light. Findings suggest that distrust prevails in this online community; however, this could be because thread members do not necessarily always comment when they perceive another member to be trustworthy.

It is interesting to note that community members with a high number of postings were perceived as less trustworthy than members with one or few postings. For instance, one member posted, ‘I would think that the reason people may think you work for Dell is the fact that you’ve got over 3k posts to this forum’.

A thread was categorized as having high source expertise if there was a high demonstration or recognition of source expertise by other members within the thread. In line with the coding agenda, for a member or a source related to the thread to be categorized as having high source expertise, they had to demonstrate, or be perceived by other thread members to have, ‘great skill and knowledge’. While threads B, E, F, H and J were categorized as low expertise, threads C, D, G and I were categorized as high expertise, and thread A was perceived to have medium expertise, as there was an equal number of posts recognizing the high and low expertise of other members within the thread.

A reason why threads C, D, G and I may demonstrate a high level of source expertise could be the nature of the thread topics, with thread members seeking advice, knowledge, skill and, ultimately, the expertise of other thread members, for example ‘how to cancel my order?????’ and ‘Vertical line on Inspiron 9300’, as opposed to the thread topic being a statement, such as ‘customer service sucks’ in thread H. Many of the posts categorized as low expertise tended to be negatively directed towards Dell employees, with statements indicating their lack of expertise, such as ‘That is how poorly trained they are…too many of their tech work off a script and don’t know what they are doing’ and ‘they either don’t know what they’re talking about or never answer’. Findings suggest that there are mixed perceptions about the expertise of the source in online customer service communities. In particular, community members perceive Dell to have low expertise when providing information about the quality of the service (eg customers’ perceptions of poor performance on delivery times).

Discussion and implications

By analysing Dell’s online community, this paper investigates how consumers assess information quality and the level of credibility of the sources of information in online customer service communities. It also highlights the challenges of managing a branded online service community.

This paper’s findings highlight that the members of the Dell online community evaluated the majority of information provided within the online community as relevant and comprehensive. Similar findings have been provided by a quantitative study of a Chinese online food community.10 Information accuracy was the dimension least often recognized within this online community. This result might be due to members not posting their perceptions of information accuracy, rather than the information being perceived as inaccurate.

The star rating system in place within the online community is not used consistently enough, making it harder for members to determine accuracy. This is consistent with previous research10 suggesting that accuracy is difficult to evaluate. This result might be due to the fact that consumers are aware that helpful votes can be easily manipulated by spammers.47 However, the low level of trust towards posters in the community may also have influenced consumers’ perceptions of the accuracy of the information provided. In line with previous research,13 this paper found that source credibility is difficult to assess within online communities.

Source credibility is the member’s perception that a source is believable, competent and trustworthy.10 Based on the findings, the online community provides members with access to source expertise when they view a thread whose topic is of interest to them. As previously outlined, salient cues such as user biographies and the number of postings made by members were not used to measure source credibility, as very few members completed their biographies, and members with a high number of postings were sometimes perceived as untrustworthy within the online community. This finding accords with previous research,47 in which top-ranked reviewers were considered less trustworthy than other reviewers. It contrasts with research that found that information cues could be used to judge credibility.17

This paper concludes that, although members did not always recognize the trustworthiness of other members, this was not an indicator of a lack of trust. Despite the perceived low trustworthiness of the source, the information communicated was itself often not deemed untrustworthy. This is supported by research that found that the electronic nature of online communities eliminates members’ ability to judge the credibility of the source of a message.2

Nevertheless, the perceived low trustworthiness was manifest in a high number of negative comments about Dell and its employees. This might be because customers perceive the business purpose of the community, in contrast to independent, customer-managed online communities. It seems that the business nature of the community is sufficient to make users suspicious about the identity of anyone who replies to their messages or posts anything that is not negative about Dell, which leads to mainly negative comments being posted.

From the analysis of the posts, a negative image of Dell, as portrayed by its community members, emerges. This image is visible to every user and could be passed on via eWOM to potential customers. This study therefore highlights the danger of managing online customer service communities, where people are more likely to complain than to write positive comments about a company, and where such negative eWOM can easily spread and convey a negative image of the company to other users.

Furthermore, the online community provides a platform for members to share their opinions and advice, be it positive or negative, with like-minded people who share an interest in Dell. The analysis identifies that having an online community linked to the company’s website can bring both benefits and drawbacks. If negative information about Dell is provided by members within the online community, this offers a place for negative eWOM to begin. Findings demonstrate how posts that were categorized as high accuracy within the online community were often negative about Dell, which could negatively impact Dell’s brand image.

Moreover, if members agree that negative information provided by other members about Dell, its products and services is accurate, this could potentially encourage negative eWOM. In addition, since customers are more eager to complain and thus write more negative reviews than positive, as shown by this and other studies,48, 49, 50 the platform might negatively affect the attitude of customers towards Dell-branded products. In fact, other customers who are planning to buy a Dell product might change their mind if they visit the community and find a higher level of criticism in the threads. Although we have not identified the effect that negative eWOM has on consumer behaviour, we believe this would be a fruitful line of future research on customer service communities.

Based on the findings, if a message in its online community is negative towards Dell, then it is in Dell’s interest for it to have low comprehensiveness and low relevance, as this provides cues about the inaccuracy of the source. Likewise, if a message is positive, it is in Dell’s interest for it to have high comprehensiveness and relevance, so that it strengthens the credibility and persuasiveness of the information. An important suggestion for online community managers is to develop tools (eg a badge system) to manage comments in the online community in order to make positive comments appear highly relevant and comprehensive and to cause negative comments to be classed as irrelevant. A solution could be to engage Dell employees from different sections of the company (eg supply chain, marketing, research and development) to write competent, comprehensive, detailed and factual comments that are useful to users, so that negative comments, which are often vacuous and emotional, will be marked as irrelevant and unreliable by community members. In this way, Dell could reduce the negative impact of the messages posted by angry customers in the community.

Moreover, a customer service platform can also work in a company’s favour in terms of positive eWOM, and it provides an opportunity for Dell to monitor customers’ current issues. The findings highlight how most information is current. Therefore, Dell can determine exactly what community members feel in real time, potentially giving it a competitive advantage over competitors who do not have access to this kind of customer feedback. The unprecedented speed of transmission over the internet4 and the extension of members’ WOM network from immediate contacts to the entire internet world13 demonstrate just how powerful eWOM can be, whether positive or negative about a company. In addition, this study highlights how thread content can provide invaluable data on a range of business issues, including service performance levels, product development, customer satisfaction and general feedback. However, further research is required to understand how this information influences consumer behaviour.

Finally, contributing to the literature on netnography as a research method to study an online community, this paper offers evidence of its usefulness for researchers wishing to gain deeper insights into online community members’ thoughts, feelings and relevant issues without relying on the usual post-hoc, reflective questionnaire surveys.

One of the major limitations of this research is that it uses posts from the Dell community referring to the period 2004–2010; since then, the nature and type of communications may have changed.