19815883
CustomerApi.Jobs.GenerateAccountMetrics
Queue
clickhouse_account_metrics
Attempt
2 of 10
Priority
0
Tags
...
Node
customer_api@10.10.0.170
Queue Time
13:50.882
Run Time
00:02.679
Inserted
12h ago
Scheduled
10h ago
Completed
10h ago (00:03)
Cancelled
—
Discarded
—
Args
%{
"account_id" => "94583",
"date" => "2026-03-06",
"query" => "users_reached",
"window_days" => 7
}
Meta
%{
"deps" => ["generate_event_counts"],
"name" => "generate_users_reached",
"on_hold" => false,
"orig_scheduled_at" => 1772759131825910,
"recorded" => true,
"return" => "g1AAAABUeJwrYWBgYC7nLS1OLSqOL0pNTM5ITUl0LZdOKs3MSUktik8tS80riUeRTmJg4BQql8grKI4vLi0qS61ElU5kAADbzx48",
"structured" => true,
"workflow" => true,
"workflow_id" => "019cc07f-7875-7f1f-914a-446f69f3aa99"
}
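The `meta` above carries Oban Pro workflow plumbing: a `workflow_id`, a job `name`, a `deps` list gating execution on `generate_event_counts`, and `recorded: true` so the return value is persisted. A minimal sketch of how a dependent job like this might be appended to a workflow — the worker module for the upstream job and the exact Oban Pro workflow API shape are assumptions, not taken from this dump:

```elixir
# Hypothetical sketch: a two-step workflow where generate_users_reached
# only runs after generate_event_counts completes. Args and step names
# mirror the job shown above; CustomerApi.Jobs.GenerateEventCounts is assumed.
alias Oban.Pro.Workflow

args = %{
  "account_id" => "94583",
  "date" => "2026-03-06",
  "query" => "users_reached",
  "window_days" => 7
}

Workflow.new()
|> Workflow.add(:generate_event_counts, CustomerApi.Jobs.GenerateEventCounts.new(args))
|> Workflow.add(
  :generate_users_reached,
  CustomerApi.Jobs.GenerateAccountMetrics.new(args),
  deps: [:generate_event_counts]
)
|> Oban.insert_all()
```

With `recorded: true`, downstream steps can read this job's `Recorded Output` instead of recomputing it.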
Recorded Output
%{
users_reached: 69,
builder_event_users_reached: 2322,
nps_survey_users_reached: 0
}
Errors
Attempt 1—10h ago
** (FunctionClauseError) no function clause matching in Ch.RowBinary.decode_names/4
The following arguments were given to Ch.RowBinary.decode_names/4:
# 1
"algorithm: Thread). (CANNOT_OPEN_FILE) (version 25.10.1.7375 (official build))\n"
# 2
42
# 3
67
# 4
[
  " is corrupted. Please retry the query: While executing MergeTreeSelect(pool: PrefetchedReadPool, ",
  "retryable error happened, or dat",
  " failed with error. It can mean that some",
  "le reading column attributes): (attempt to read data part all_22284904_22717272_59_22712811 (state Active",
  " (position: 7888693116, typename: DB::AsynchronousBoundedReadBuffer, compressed data header: <uninitialized>): (wh",
  "compressing ch-s3-bfe/aa8129e2-f47e-4b75-bbcb-8441591d3836/mergetree/mhl/zoyfsiydchbjoddekclxlygyiuuz",
  "27, kind: Regular, unbound: 0: While reading or d",
  "20, caller id: 529b8951-3ea7-4ef0-b318-c31985f3fcf9:2",
  "18-c31985f3fcf9:2127, current write offset: 7885291",
  "ION, downloaded size: 0, reserved size: 1048576, downloader id: 529b8951-3ea7-4ef0-b",
  "PARTIALLY_DOWNLOADED_NO_CONTINUA",
  "89485823], key: cfcde61893b81e4dcf4d018d05fcf52e, state:",
  "7, file segment info: File segment: [7885291520, 7",
  " end: 7886340096, read_type: REMOTE_FS_READ_AND_PUT_IN_CACHE, last caller: 529b8951-3ea7-4ef0-b318-c31985f3fcf9:21",
  "93116, read_until_position: 7888979526, internal buffe",
  "hl/zoyfsiydchbjoddekclxlygyiuuzr, hash key: cfcde61893b81e4dcf4d018d05fcf52e, file_offset_of_buffer_end: 7888",
  " ch-s3-bfe/aa8129e2-f47e-4b75-bbcb-8441591d3836/mergetree/",
  ":2127, kind: Regular, unbound: 0: Cache info: Buffer path",
  "85291520, caller id: 529b8951-3ea7-4ef0-b318-c31985f3fcf",
  "rved size: 1048576, downloader id: 529b8951-3ea7-4ef0-b318-c31985f3fcf9:2127, current write offset: 7",
  " state: DOWNLOADING, downloaded size: 0, res",
  "e system, current cache state: File segment: [7885291520, 7889485823], key: cfcde61893b81e4dcf4d018d05fcf52e",
  "e/sharedS3DiskCache/cfc/cfcde61893b81e4dcf4d018d05fcf52e/7885291520: , errno: 30, strerror: Read-only fi",
  "f52e/7885291520: , errno: 30, strerror: Read-only file system: Cannot open file /mnt/clickhouse-cac",
  "de: 76. DB::Exception: Cannot open file /mnt/clickhouse-cache/sharedS3DiskCache/cfc/cfcde61893b81e4dcf4d018d05f"
]
(ch 0.7.1) lib/ch/row_binary.ex:789: Ch.RowBinary.decode_names/4
(ch 0.7.1) lib/ch/query.ex:290: DBConnection.Query.Ch.Query.decode/3
(db_connection 2.9.0) lib/db_connection.ex:1470: DBConnection.decode/4
(db_connection 2.9.0) lib/db_connection.ex:849: DBConnection.execute/4
(ch 0.7.1) lib/ch.ex:94: Ch.query/4
(ecto_sql 3.13.4) lib/ecto/adapters/sql.ex:620: Ecto.Adapters.SQL.query!/4
(ecto_ch 0.8.6) lib/ecto/adapters/clickhouse.ex:323: Ecto.Adapters.ClickHouse.execute/5
(ecto 3.13.5) lib/ecto/repo/queryable.ex:241: Ecto.Repo.Queryable.execute/4
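Reading the fragments above, attempt 1 failed because ClickHouse returned a mid-stream error body instead of RowBinary data: a cache file on a read-only filesystem could not be opened (errno 30, CANNOT_OPEN_FILE), which the server itself labels retryable ("Please retry the query"). Oban's normal retry covered it and attempt 2 completed. If such transient failures should snooze rather than consume the 10-attempt budget, the worker could pattern-match on the message — a sketch under assumptions (`run_query/1` and the matched substrings are illustrative, not from this dump):

```elixir
# Hypothetical sketch: snooze on transient ClickHouse cache errors
# instead of burning a retry attempt; reraise anything else.
defmodule CustomerApi.Jobs.GenerateAccountMetrics do
  use Oban.Worker, queue: :clickhouse_account_metrics, max_attempts: 10

  @impl Oban.Worker
  def perform(%Oban.Job{args: args}) do
    run_query(args)
  rescue
    error ->
      if transient?(Exception.message(error)) do
        # Re-run in 30s without counting against max_attempts' intent
        {:snooze, 30}
      else
        reraise error, __STACKTRACE__
      end
  end

  # Substrings chosen from the error seen above; tune to taste.
  defp transient?(message) do
    message =~ "Please retry the query" or message =~ "CANNOT_OPEN_FILE"
  end

  defp run_query(_args), do: :ok # placeholder for the real ClickHouse query
end
```

Note that `{:snooze, n}` is a standard `Oban.Worker.perform/1` return; whether to use it here versus plain retries is a policy choice, since the default backoff already handled this case.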