
R sentiment analysis; 'lexicon' not found; 'sentiments' corrupted?


I am trying to follow this online sentiment analysis tutorial. The code:

library(dplyr)
library(tidytext)

new_sentiments <- sentiments %>%  # from the tidytext package
  filter(lexicon != "loughran") %>%  # remove the finance lexicon
  mutate(sentiment = ifelse(lexicon == "AFINN" & score >= 0, "positive",
                     ifelse(lexicon == "AFINN" & score < 0,
                            "negative", sentiment))) %>%
  group_by(lexicon) %>%
  mutate(words_in_lexicon = n_distinct(word)) %>%
  ungroup()

produces the error:

Error in filter_impl(.data, quo) : 
  Evaluation error: object 'lexicon' not found.

Possibly related, at least to my eye: the sentiments table itself behaves strangely (corrupted?). Here is the gist of sentiments:

> head(sentiments, 3)
  element_id sentence_id word_count sentiment                                 chapter
1          1           1          7         0 The First Book of Moses: Called Genesis
2          2           1         NA         0 The First Book of Moses: Called Genesis
3          3           1         NA         0 The First Book of Moses: Called Genesis
                                 category
1 The First Book of Moses: Called Genesis
2 The First Book of Moses: Called Genesis
3 The First Book of Moses: Called Genesis
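
That layout does not match tidytext's documented sentiments table (plain word and sentiment columns), which suggests some other object named sentiments may be masking the package dataset. A minimal sketch of how to check, using only standard base R tools:

# List every place on the search path that defines "sentiments";
# anything listed before "package:tidytext" masks the package data.
find("sentiments")

# Inspect the package's own copy directly, bypassing any masking:
head(tidytext::sentiments, 3)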

If I call get_sentiments with bing, AFINN, or NRC, the responses look reasonable:

> get_sentiments("bing")
# A tibble: 6,788 x 2
  word     sentiment
  <chr>    <chr>    
1 2-faced  negative 
2 2-faces  negative 
3 a+       positive 
4 abnormal negative 

I tried removing (remove.packages) and reinstalling tidytext; the behavior did not change. I am running R 3.5.
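
For reference, the exact versions involved can be confirmed from the console (a quick sketch; both calls are standard base R):

packageVersion("tidytext")  # the bundled sentiments data changed between releases
R.version.string            # e.g. "R version 3.5.0 ..."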

Even if I am completely misunderstanding the problem, I would appreciate any insight anyone can offer.

Best Answer

The lexicon and score columns that the tutorial relies on were removed from tidytext's built-in sentiments dataset in later releases (it now ships only the bing lexicon, and the AFINN score column is named value). The following rebuilds the new_sentiments dataset as used in the Data Camp tutorial:

library(dplyr)
library(tidyr)        # spread()
library(tidytext)
library(formattable)  # color_tile(), color_bar()

# Tag each lexicon and record its number of distinct words
bing <- get_sentiments("bing") %>%
  mutate(lexicon = "bing",
         words_in_lexicon = n_distinct(word))

nrc <- get_sentiments("nrc") %>%
  mutate(lexicon = "nrc",
         words_in_lexicon = n_distinct(word))

afinn <- get_sentiments("afinn") %>%
  mutate(lexicon = "afinn",
         words_in_lexicon = n_distinct(word))

new_sentiments <- bind_rows(bing, nrc, afinn)

# Newer tidytext returns the AFINN column as 'value'; the tutorial expects 'score'
names(new_sentiments)[names(new_sentiments) == "value"] <- "score"

new_sentiments %>%
  group_by(lexicon, sentiment, words_in_lexicon) %>%
  summarise(distinct_words = n_distinct(word)) %>%
  ungroup() %>%
  spread(sentiment, distinct_words) %>%
  mutate(lexicon = color_tile("lightblue", "lightblue")(lexicon),
         words_in_lexicon = color_bar("lightpink")(words_in_lexicon)) %>%
  my_kable_styling(caption = "Word Counts per Lexicon")  # helper defined in the tutorial

The subsequent charts in the tutorial will then work as well!
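
As a quick sanity check (a sketch, assuming the rebuild above ran without errors), the combined data frame should now expose the columns the tutorial expects:

glimpse(new_sentiments)  # expect: word, sentiment, score, lexicon, words_in_lexicon

# One row per lexicon with its distinct word count
new_sentiments %>%
  distinct(lexicon, words_in_lexicon)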

The original question, "R sentiment analysis; 'lexicon' not found; 'sentiments' corrupted?", is on Stack Overflow: https://stackoverflow.com/questions/51127671/
