19818615
CustomerApi.Jobs.GenerateAccountMetrics
Queue
clickhouse_account_metrics
Attempt
2 of 10
Priority
0
Tags
...
Node
customer_api@10.10.1.214
Queue Time
12:59.016
Run Time
00:00.460
Inserted
4h ago
Scheduled
3h ago
Completed
3h ago (00:01)
Cancelled
—
Discarded
—
Args
%{
"account_id" => "35027",
"date" => "2026-03-06",
"query" => "users_reached",
"window_days" => 1
}
Meta
%{
"deps" => ["generate_event_counts"],
"name" => "generate_users_reached",
"on_hold" => false,
"orig_scheduled_at" => 1772759446775217,
"recorded" => true,
"return" => "g1AAAABReJwrYWBgYC7nLS1OLSqOL0pNTM5ITUlkLpdOKs3MSUktik8tS80riUeVZiiXyCsoji8uLSpLrUSTYwcAj+Md5Q",
"structured" => true,
"workflow" => true,
"workflow_id" => "019cc07f-78bc-734c-8220-dfbf7c9a9134"
}
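The Meta map above ("deps", "name", "workflow", "workflow_id", "recorded") marks this as one step in an Oban Pro workflow: "generate_users_reached" runs only after "generate_event_counts" completes, and its return value is recorded. A hedged sketch of how such a workflow might be assembled, assuming Oban Pro's `Oban.Pro.Workflow` API — the use of `CustomerApi.Jobs.GenerateAccountMetrics` for both steps and the exact option names are assumptions, not taken from this dump:

```elixir
# Sketch only: requires the commercial Oban Pro package; API per its
# Workflow docs. Step names, deps, and args mirror the job's meta/args
# above; everything else (module reuse, Map.put shape) is hypothetical.
alias Oban.Pro.Workflow

args = %{"account_id" => "35027", "date" => "2026-03-06", "window_days" => 1}

Workflow.new()
|> Workflow.add(
  :generate_event_counts,
  CustomerApi.Jobs.GenerateAccountMetrics.new(Map.put(args, "query", "event_counts"))
)
|> Workflow.add(
  :generate_users_reached,
  CustomerApi.Jobs.GenerateAccountMetrics.new(Map.put(args, "query", "users_reached")),
  deps: [:generate_event_counts]
)
|> Oban.insert_all()
```

With `deps: [:generate_event_counts]`, Oban Pro holds this job (cf. `"on_hold"` in Meta) until its upstream step finishes, which matches the gap between Inserted (4h ago) and Scheduled (3h ago) above.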
Recorded Output
%{users_reached: 3, builder_event_users_reached: 0, nps_survey_users_reached: 7}
Errors
Attempt 1—3h ago
** (FunctionClauseError) no function clause matching in Ch.RowBinary.decode_names/4
The following arguments were given to Ch.RowBinary.decode_names/4:
# 1
"ild))\n"
# 2
45
# 3
67
# 4
["eadPool, algorithm: Thread). (CANNOT_OPEN_FILE) (version 25.10.1.7375 (official bu", "8582: While executing MergeTreeSelect(pool: Prefetched", "ryable error happened, or data is corrupted. Please retry the query: While reading part all_24202139_24567826_35_245", "t to read data part all_24202139_24567826_35_24568582 (state Active) failed with error. It can mean that some re", "ynchronousBoundedReadBuffer, compressed data header: <uninitialized>): (while reading column _block_number): (attem", "bcb-8441591d3836/mergetree/hrf/eiowpkjdqhdxxfcrtgbqawyrmgufm (position: 189204146, typename: DB::A", "ff663463:1089, kind: Regular, unbound: 0: While reading or decompressing ch-s3-8c5/aa8129e2-f47e-4b75-", "e-8e71-4f7fff663463:1089, current write offset: 188743680, caller id: 9c47568e-dd24-44ae-8e71-4f7", "INUATION, downloaded size: 0, reserved size: 460466, downloader id: 9c47568e-dd24-44", "ment: [188743680, 192937983], key: 2809fcb6bad37bfa57262c4d487f0d9b, state: PARTIALLY_DOWNLOADED_NO_CON", "4f7fff663463:1089, file segment info: File se", "ternal buffer end: 189792256, read_type: REMOTE_FS_READ_AND_PUT_IN_CACHE, last caller: 9c47568e-dd24-44ae-8e71", "b6bad37bfa57262c4d487f0d9b, file_offset_of_buffer_end: 189204146, read_until_position: 189235884, i", " path: ch-s3-8c5/aa8129e2-f47e-4b75-bbcb-8441591d3836/mergetree/hrf/eiowpkjdqhdxxfcrtgbqawyrmgufm, hash key: 2809f", "089, kind: Regular, unbound: 0: Cache info: Buffe", "743680, caller id: 9c47568e-dd24-44ae-8e71-4f7fff663463:", "4ae-8e71-4f7fff663463:1089, current write offset: 18", "ADING, downloaded size: 0, reserved size: 460466, downloader id: 9c47568e-dd24-", "tem, current cache state: File segment: [188743680, 192937983], key: 2809fcb6bad37bfa57262c4d487f0d9b, state: DOWNL", "aredS3DiskCache/280/2809fcb6bad37bfa57262c4d487f0d9b/188743680: , errno: 30, strerror: Read-only file sy", "0d9b/188743680: , errno: 30, strerror: Read-only file system: Cannot open file /mnt/clickhouse-cache/s", "de: 76. 
DB::Exception: Cannot open file /mnt/clickhouse-cache/sharedS3DiskCache/280/2809fcb6bad37bfa57262c4d487"]
(ch 0.7.1) lib/ch/row_binary.ex:789: Ch.RowBinary.decode_names/4
(ch 0.7.1) lib/ch/query.ex:290: DBConnection.Query.Ch.Query.decode/3
(db_connection 2.9.0) lib/db_connection.ex:1470: DBConnection.decode/4
(db_connection 2.9.0) lib/db_connection.ex:849: DBConnection.execute/4
(ch 0.7.1) lib/ch.ex:94: Ch.query/4
(ecto_sql 3.13.4) lib/ecto/adapters/sql.ex:620: Ecto.Adapters.SQL.query!/4
(ecto_ch 0.8.6) lib/ecto/adapters/clickhouse.ex:323: Ecto.Adapters.ClickHouse.execute/5
(ecto 3.13.5) lib/ecto/repo/queryable.ex:241: Ecto.Repo.Queryable.execute/4