If the JSON dataset to ingest contains the \u0000 Unicode character, the ingestion process fails with a PostgreSQL error. I encountered this when trying to upload data collected with the SharpHound "collectallproperties" flag enabled.
Component(s) Affected:
UI
API
Neo4j
PostgreSQL
Data Collector (SharpHound, AzureHound)
Other (tooling, documentation, etc.)
Steps to Reproduce:
Collect data with the SharpHound "collectallproperties" option (collect all LDAP properties) enabled.
Upload it to BloodHound CE (I used the UI to upload the JSON files).
Ingestion ends in "fail" status.
The app-db-1 container logs a PostgreSQL error in the Docker logs.
Expected Behavior:
Data should upload and ingest successfully even if it contains special Unicode characters.
Actual Behavior:
Currently, the \u0000 Unicode escape sequence causes an ingestion error.
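To illustrate the mismatch: the JSON specification (RFC 8259) permits the \u0000 escape inside strings, so the collector output is syntactically valid JSON, but PostgreSQL's jsonb type stores strings as text, and Postgres text values cannot contain NUL bytes. A minimal sketch of the valid-JSON side:

```python
import json

# \u0000 is a legal escape in a JSON string per RFC 8259,
# so standard parsers accept it without complaint.
doc = json.loads('{"auditingpolicy": "\\u0000\\u0001"}')
print(repr(doc["auditingpolicy"]))  # "'\\x00\\x01'"
# PostgreSQL jsonb, however, rejects \u0000 at parse time
# ("unsupported Unicode escape sequence"), which is the error seen below.
```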
Screenshots/Code Snippets/Sample Files:
Here is the error message in the logs:
app-db-1 | 2023-12-14 00:50:23.632 UTC [91] ERROR: unsupported Unicode escape sequence
app-db-1 | 2023-12-14 00:50:23.632 UTC [91] DETAIL: \u0000 cannot be converted to text.
app-db-1 | 2023-12-14 00:50:23.632 UTC [91] CONTEXT: JSON data, line 1: {"auditingpolicy":...
app-db-1 | 2023-12-14 00:50:23.632 UTC [91] STATEMENT: INSERT INTO "asset_group_collection_entries"
app-db-1 | ("asset_group_collection_id","object_id","node_label","properties","created_at","updated_at")
app-db-1 | (SELECT * FROM unnest($1::bigint[], $2::text[], $3::text[], $4::jsonb[], $5::timestamp[], $5::timestamp[]));
The referenced JSON data (from *_domains.json): ... "auditingpolicy":"\u0000\u0001" ...
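As a possible workaround (not part of BloodHound itself), the uploaded files can be pre-processed to strip NUL characters before ingestion. A hedged sketch; the function name `strip_nul` and the example document are hypothetical:

```python
import json

def strip_nul(value):
    """Recursively remove \\u0000 (NUL) characters, which PostgreSQL's
    jsonb type rejects, from a decoded JSON structure."""
    if isinstance(value, str):
        return value.replace("\u0000", "")
    if isinstance(value, list):
        return [strip_nul(v) for v in value]
    if isinstance(value, dict):
        return {strip_nul(k): strip_nul(v) for k, v in value.items()}
    return value  # numbers, booleans, None pass through unchanged

# Example: sanitize a collector output file before uploading it.
raw = '{"auditingpolicy": "\\u0000\\u0001"}'
clean = json.dumps(strip_nul(json.loads(raw)))
print(clean)  # {"auditingpolicy": "\u0001"}
```

This drops the NUL bytes rather than re-escaping them; a proper fix would likely belong in the ingestion code, where the property values could be sanitized before the INSERT.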
On Mar 26, 2024, slokie-so added the "ticketed" label and removed the "triage" label.
Environment Information:
BloodHound:
Collector:
OS:
Database (if persistence related): Neo4j 4.4 / PostgreSQL 13.2
Docker (if using Docker):
Contributor Checklist: