I have a simple Airflow DAG with just one task - stream_from_twitter_to_kafka.
Here is the code for the DAG:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    "owner": "me",
    "depends_on_past": False,
    "start_date": datetime(2020, 1, 20),
    "email": ["makalaaneesh18@mail.com"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 0,
    "retry_delay": timedelta(minutes=1),
}

NO_OF_TWEETS_TO_STREAM = 100

with DAG("stream_from_twitter",
         catchup=False,
         default_args=default_args,
         schedule_interval="@hourly") as dag:
    task1 = PythonOperator(task_id="stream_from_twitter_to_kafka",
                           python_callable=read_stream_of_tweets,
                           op_args=(NO_OF_TWEETS_TO_STREAM,))
    task1
The code for read_stream_of_tweets reads the incoming stream of tweets using tweepy and publishes them to a Kafka topic:
# override tweepy.StreamListener to add logic to on_status
class MyStreamListener(tweepy.StreamListener):
    def __init__(self, *args, **kwargs):
        self.num_tweets = kwargs.pop('num_tweets')
        self.current_num_tweets = 0
        super(MyStreamListener, self).__init__(*args, **kwargs)
        self.kafka_producer = MyKafkaProducer()

    def on_status(self, status):
        if self.current_num_tweets >= self.num_tweets:
            # Limiting to a number.
            return False
        if not status.text.startswith("RT"):
            print(status.text)
            status_info = {
                'id': status.id_str,
                'text': status.text
            }
            self.kafka_producer.publish_message(TOPIC_PUB, value=status_info)
            self.current_num_tweets = self.current_num_tweets + 1

    def on_error(self, status_code):
        if status_code == 420:
            # returning False in on_data disconnects the stream
            return False


def read_stream_of_tweets(n):
    auth = tweepy.OAuthHandler(consumer_token, consumer_secret)
    auth.set_access_token(access_token, access_secret)

    myStreamListener = MyStreamListener(num_tweets=n)
    myStream = tweepy.Stream(auth=auth, listener=myStreamListener)
    myStream.filter(track=['life'], languages=['en'])
*** Reading local file: /Users/aneeshmakala/Documents/ComputerScience/datascience/hapPy/airflow/logs/stream_from_twitter/stream_from_twitter_to_kafka/2020-01-20T12:27:48.408593+00:00/1.log
[2020-01-20 17:58:27,264] {base_task_runner.py:61} DEBUG - Planning to run as the user
[2020-01-20 17:58:27,272] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not have depends_on_past set.
[2020-01-20 17:58:27,272] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Not In Retry Period' PASSED: True, The task instance was not marked for retrying.
[2020-01-20 17:58:27,273] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not have any upstream tasks.
[2020-01-20 17:58:27,273] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Task Instance State' PASSED: True, Task state queued was valid.
[2020-01-20 17:58:27,273] {taskinstance.py:655} INFO - Dependencies all met for <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]>
[2020-01-20 17:58:27,277] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Pool Slots Available' PASSED: True, ('There are enough open slots in %s to execute the task', 'default_pool')
[2020-01-20 17:58:27,280] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Previous Dagrun State' PASSED: True, The task did not have depends_on_past set.
[2020-01-20 17:58:27,280] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Not In Retry Period' PASSED: True, The task instance was not marked for retrying.
[2020-01-20 17:58:27,280] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Task Concurrency' PASSED: True, Task concurrency is not set.
[2020-01-20 17:58:27,280] {taskinstance.py:672} DEBUG - <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]> dependency 'Trigger Rule' PASSED: True, The task instance did not have any upstream tasks.
[2020-01-20 17:58:27,280] {taskinstance.py:655} INFO - Dependencies all met for <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [queued]>
[2020-01-20 17:58:27,280] {taskinstance.py:866} INFO -
--------------------------------------------------------------------------------
[2020-01-20 17:58:27,280] {taskinstance.py:867} INFO - Starting attempt 1 of 1
[2020-01-20 17:58:27,280] {taskinstance.py:868} INFO -
--------------------------------------------------------------------------------
[2020-01-20 17:58:27,286] {taskinstance.py:887} INFO - Executing <Task(PythonOperator): stream_from_twitter_to_kafka> on 2020-01-20T12:27:48.408593+00:00
[2020-01-20 17:58:27,288] {standard_task_runner.py:52} INFO - Started process 11912 to run task
[2020-01-20 17:58:27,315] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,314] {cli_action_loggers.py:68} DEBUG - Calling callbacks: [<function default_action_log at 0x10da70830>]
[2020-01-20 17:58:27,326] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,326] {settings.py:213} DEBUG - Setting up DB connection pool (PID 11912)
[2020-01-20 17:58:27,327] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,326] {settings.py:221} DEBUG - settings.configure_orm(): Using NullPool
[2020-01-20 17:58:27,329] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,329] {dagbag.py:403} INFO - Filling up the DagBag from /Users/aneeshmakala/Documents/ComputerScience/datascience/hapPy/airflow/dags/stream_from_twitter.py
[2020-01-20 17:58:27,330] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,330] {dagbag.py:232} DEBUG - Importing /Users/aneeshmakala/Documents/ComputerScience/datascience/hapPy/airflow/dags/stream_from_twitter.py
[2020-01-20 17:58:27,332] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,332] {dagbag.py:370} DEBUG - Loaded DAG <DAG: stream_from_twitter>
[2020-01-20 17:58:27,351] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: stream_from_twitter.stream_from_twitter_to_kafka 2020-01-20T12:27:48.408593+00:00 [running]> 1.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.0.ip6.arpa
[2020-01-20 17:58:27,364] {__init__.py:97} DEBUG - Preparing lineage inlets and outlets
[2020-01-20 17:58:27,364] {__init__.py:133} DEBUG - inlets: [], outlets: []
[2020-01-20 17:58:27,364] {python_operator.py:105} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=makalaaneesh18@mail.com
AIRFLOW_CTX_DAG_OWNER=me
AIRFLOW_CTX_DAG_ID=stream_from_twitter
AIRFLOW_CTX_TASK_ID=stream_from_twitter_to_kafka
AIRFLOW_CTX_EXECUTION_DATE=2020-01-20T12:27:48.408593+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2020-01-20T12:27:48.408593+00:00
[2020-01-20 17:58:27,367] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,366] {oauth1_auth.py:77} DEBUG - Signing request <PreparedRequest [POST]> using client <Client client_key=XXXXXX, client_secret=****, resource_owner_key=XXXXXX, resource_owner_secret=****, signature_method=HMAC-SHA1, signature_type=AUTH_HEADER, callback_uri=None, rsa_key=None, verifier=None, realm=None, encoding=utf-8, decoding=None, nonce=None, timestamp=None>
[2020-01-20 17:58:27,368] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,367] {oauth1_auth.py:93} DEBUG - Including body in call to sign: True
[2020-01-20 17:58:27,369] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,369] {__init__.py:133} DEBUG - Collected params: [('delimited', 'length'), ('oauth_nonce', 'XXXXXX'), ('oauth_timestamp', '1579523307'), ('oauth_version', '1.0'), ('oauth_signature_method', 'HMAC-SHA1'), ('oauth_consumer_key', 'XXXXXX'), ('oauth_token', 'XXXXXX'), ('track', 'life'), ('language', 'en')]
[2020-01-20 17:58:27,370] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,370] {__init__.py:137} DEBUG - Normalized params: delimited=length&language=en&oauth_consumer_key=XXXXXX&oauth_nonce=XXXXXX&oauth_signature_method=HMAC-SHA1&oauth_timestamp=1579523307&oauth_token=XXXXXX&oauth_version=1.0&track=life
[2020-01-20 17:58:27,370] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,370] {__init__.py:138} DEBUG - Normalized URI: https://stream.twitter.com/1.1/statuses/filter.json
[2020-01-20 17:58:27,371] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,371] {__init__.py:143} DEBUG - Signing: signature base string: POST&https%3A%2F%2Fstream.twitter.com%2F1.1%2Fstatuses%2Ffilter.json&delimited%3Dlength%26language%3Den%26oauth_consumer_key%3DXXXXXX%26oauth_nonce%3DXXXXXX%26oauth_signature_method%3DHMAC-SHA1%26oauth_timestamp%3D1579523307%26oauth_token%3DXXXXXX%26oauth_version%3D1.0%26track%3Dlife
[2020-01-20 17:58:27,371] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,371] {__init__.py:150} DEBUG - Signature: JEwre9zNc+Ge6ezoGop6oXpp5Js=
[2020-01-20 17:58:27,372] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,372] {oauth1_auth.py:114} DEBUG - Updated url: https://stream.twitter.com/1.1/statuses/filter.json?delimited=length
[2020-01-20 17:58:27,372] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,372] {oauth1_auth.py:115} DEBUG - Updated headers: {'Content-Type': 'application/x-www-form-urlencoded', 'Content-Length': '22', 'Authorization': 'OAuth oauth_nonce="XXXXXX", oauth_timestamp="1579523307", oauth_version="1.0", oauth_signature_method="HMAC-SHA1", oauth_consumer_key="XXXXXX", oauth_token="XXXXXX", oauth_signature="JEwre9zNc%2BGe6ezoGop6oXpp5Js%3D"'}
[2020-01-20 17:58:27,373] {logging_mixin.py:112} INFO - [2020-01-20 17:58:27,373] {oauth1_auth.py:116} DEBUG - Updated body: '\x1b[1mtrack=life&language=en\x1b[0m'
[2020-01-20 17:58:32,274] {logging_mixin.py:112} INFO - [2020-01-20 17:58:32,273] {base_job.py:200} DEBUG - [heartbeat]
[2020-01-20 17:58:32,274] {logging_mixin.py:112} INFO - [2020-01-20 17:58:32,274] {local_task_job.py:124} DEBUG - Time since last heartbeat(0.01 s) < heartrate(5.0 s), sleeping for 4.990854 s
[2020-01-20 17:58:37,265] {logging_mixin.py:112} INFO - [2020-01-20 17:58:37,265] {local_task_job.py:103} INFO - Task exited with return code -6
airflow test stream_from_twitter stream_from_twitter_to_kafka 20200120
Testing the task manually offline works like a charm. But when it is run by the scheduler, it fails with the logs above.
Best Answer
This has nothing to do with airflow or tweepy.
The issue is specific to macOS High Sierra and later. https://stackoverflow.com/a/52230415/4434664 solved my problem.
Basically, airflow test just runs the task in-process, but the scheduler spawns a worker process that calls fork(), and High Sierra apparently introduced some breaking new security changes around fork() usage in Python.
This also caused problems for ansible. See https://github.com/ansible/ansible/issues/32499#issuecomment-341578864
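A sketch of the workaround from the linked answer: set an environment variable that disables the Objective-C runtime's fork-safety check before the scheduler is started (the exact setup depends on how you launch the scheduler, e.g. a shell, launchd, or a supervisor config):

```shell
# Workaround from the linked answer: stop macOS's Objective-C runtime
# from aborting forked worker processes after initialization.
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES

# The scheduler must then be (re)started from this same shell so the
# worker processes it forks inherit the variable, e.g.:
#   airflow scheduler
echo "$OBJC_DISABLE_INITIALIZE_FORK_SAFETY"
```

Note this only suppresses the fork-safety check; it does not change anything in the DAG or the tweepy code.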
Regarding python - Airflow task running tweepy exits with return code -6, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/59821777/