Hashtags and followers: An experimental study of the online social network Twitter
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
Twitter Inc., San Francisco, CA, USA.
2016 (English). In: Social Network Analysis and Mining, ISSN 1869-5450, Vol. 6, no. 1, article id UNSP 12. Article in journal (Refereed). Published.
Abstract [en]

We have conducted an analysis of data from 502,891 Twitter users, focusing on the potential correlation between hashtags and the increase of followers, to determine whether adding hashtags to tweets produces new followers. We designed an experiment with two groups of users: one tweeting with random hashtags and one tweeting without hashtags. The results showed a correlation between hashtags and followers: on average, users tweeting with hashtags increased their followers by 2.88, while users tweeting without hashtags increased theirs by 0.88. We present a simple, reproducible approach to extract and analyze Twitter user data for this and similar purposes.
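The record does not include the authors' analysis scripts, so the following is a minimal, hypothetical sketch of the group comparison described in the abstract. It assumes the collected measurements have been exported to a CSV file (here called twitter_experiment.csv) with illustrative columns user_id, group, followers_before, and followers_after, and computes the average follower increase per group.

    # Hypothetical sketch, not the authors' code: compare the average follower
    # increase for users tweeting with hashtags vs. without. The file name and
    # column names are assumptions for illustration only.
    import csv
    from collections import defaultdict

    def mean_follower_increase(path):
        gains = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                gain = int(row["followers_after"]) - int(row["followers_before"])
                gains[row["group"]].append(gain)
        # Average follower increase per experimental group.
        return {group: sum(g) / len(g) for group, g in gains.items()}

    if __name__ == "__main__":
        # The paper reports averages of 2.88 (hashtag group) and 0.88 (control).
        print(mean_follower_increase("twitter_experiment.csv"))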

Place, publisher, year, edition, pages
Springer, 2016. Vol. 6, no. 1, article id UNSP 12
Keywords [en]
Experimental study, Correlational analysis, Hashtags, Followers
National Category
Media and Communication Technology; Other Computer and Information Science
Identifiers
URN: urn:nbn:se:bth-13048. DOI: 10.1007/s13278-016-0320-6. ISI: 000381220500012. OAI: oai:DiVA.org:bth-13048. DiVA id: diva2:1006904.
Part of project
Bigdata@BTH - Scalable resource-efficient systems for big data analytics, Knowledge Foundation
Available from: 2016-09-30. Created: 2016-09-30. Last updated: 2021-05-05. Bibliographically approved.
In thesis
1. Extraction and Energy Efficient Processing of Streaming Data
2017 (English). Licentiate thesis, comprehensive summary (Other academic).
Abstract [en]

The interest in machine learning algorithms is increasing, in parallel with the advancements in hardware and software required to mine large-scale datasets. Machine learning algorithms account for a significant amount of the energy consumed in data centers, which impacts global energy consumption. However, machine learning algorithms are optimized towards predictive performance and scalability. Algorithms with low energy consumption are necessary for embedded systems and other resource-constrained devices, and desirable for platforms that require many computations, such as data centers. Data stream mining investigates how to process potentially infinite streams of data without the need to store all the data. This ability is particularly useful for companies that are generating data at a high rate, such as social networks.

This thesis investigates algorithms in the data stream mining domain from an energy efficiency perspective. The thesis comprises two parts. The first part explores how to extract and analyze data from Twitter, with a pilot study that investigates the correlation between hashtags and followers. The second and main part investigates how energy is consumed and optimized in an online learning algorithm suitable for data stream mining tasks.

The second part of the thesis focuses on analyzing, understanding, and reformulating the Very Fast Decision Tree (VFDT) algorithm, the original Hoeffding tree algorithm, into an energy-efficient version. It presents three key contributions. First, it shows how energy consumption varies in the VFDT from a high-level view by tuning different parameters. Second, it presents a methodology to identify energy bottlenecks in machine learning algorithms, by portraying the functions of the VFDT that consume the largest amount of energy. Third, it introduces dynamic parameter adaptation for Hoeffding trees, a method to dynamically adapt the parameters of Hoeffding trees to reduce their energy consumption. The results show an average energy reduction of 23% on the VFDT algorithm.
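For context, split decisions in the VFDT are governed by the Hoeffding bound, and dynamic parameter adaptation tunes parameters such as the number of examples between split attempts (n_min) at run time. The sketch below shows the standard bound and a purely illustrative adaptation rule; the abstract does not specify the exact rule used in the thesis, so the adapt_n_min heuristic and its thresholds are assumptions.

    # Schematic sketch, not the thesis implementation. The Hoeffding bound is
    # the standard VFDT split criterion; the adaptation rule below is a
    # hypothetical illustration of tuning n_min at run time to avoid wasted
    # computation at leaves that are far from splitting.
    import math

    def hoeffding_bound(value_range, delta, n):
        # epsilon = sqrt(R^2 * ln(1/delta) / (2n)), with R the range of the
        # split heuristic (log2(#classes) for information gain) and n the
        # number of examples seen at the leaf.
        return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

    def should_split(best_gain, second_gain, n, num_classes, delta=1e-7, tau=0.05):
        # Split when the observed gain difference exceeds the bound,
        # or when the bound is small enough to treat the attributes as tied.
        eps = hoeffding_bound(math.log2(num_classes), delta, n)
        return (best_gain - second_gain) > eps or eps < tau

    def adapt_n_min(n_min, best_gain, second_gain, n, num_classes, delta=1e-7):
        # Hypothetical rule: postpone the next split attempt while the gain
        # difference is far below the bound, and check sooner when a split
        # looks imminent. The factor 0.5 and the bounds are illustrative.
        eps = hoeffding_bound(math.log2(num_classes), delta, n)
        if best_gain - second_gain < 0.5 * eps:
            return min(2 * n_min, 1000)
        return max(n_min // 2, 100)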

Place, publisher, year, edition, pages
Karlskrona: Blekinge Tekniska Högskola, 2017
Series
Blekinge Institute of Technology Licentiate Dissertation Series, ISSN 1650-2140 ; 3
Keywords
machine learning, green computing, data mining, data stream mining, green machine learning
National Category
Computer Sciences
Identifiers
urn:nbn:se:bth-15532 (URN)
Presentation
2017-12-18, J1640, Blekinge Tekniska Högskola, 371 79, Karlskrona, 13:00 (English)
Projects
Scalable resource-efficient systems for big data analytics
Funder
Knowledge Foundation, 20140032
Available from: 2017-11-22. Created: 2017-11-22. Last updated: 2018-01-13. Bibliographically approved.

Open Access in DiVA

fulltext (2607 kB), 13557 downloads
File information
File name: FULLTEXT01.pdf
File size: 2607 kB
Checksum (SHA-512): 85196af4bf42611a12d9303c75ea78b441da78e3350aad12e4cf829d29f4ff87cba85ba0145ac608471d4bf69018ab512381e8c52ce6316f24eeb0834cc009e0
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Authority records

Martin, Eva Garcia; Lavesson, Niklas

