Comments
ph...@buildertrend.com <ph...@buildertrend.com> #2
I have forwarded this request to the engineering team. We will update this issue with any progress updates and a resolution.
[Deleted User] <[Deleted User]> #3
Hello! Sorry to bring up this issue after almost a year, but I wanted to add that we chose the metric identifier agent.googleapis.com/memory/percent_used and autoscaling didn't work out for us either. It would be appreciated if you could guide us.
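For context, this is roughly the kind of setup we mean; a minimal sketch assuming a GCE managed instance group (the group name, zone, and 60% target are placeholders, since the original comment doesn't show the actual configuration):
# Sketch only: autoscale a managed instance group on the Ops Agent memory
# metric. web-mig, the zone, and the target value are placeholders.
gcloud compute instance-groups managed set-autoscaling web-mig \
  --zone=us-central1-a \
  --min-num-replicas=2 \
  --max-num-replicas=10 \
  --custom-metric-utilization metric=agent.googleapis.com/memory/percent_used,utilization-target=60,utilization-target-type=GAUGE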
ta...@belong.co.jp <ta...@belong.co.jp> #4
Hi, at the moment we are using the cpu_utilization/target_utilization attribute (in app.yaml) for autoscaling in the App Engine flexible environment. It would be great if we could also specify a memory_utilization metric to drive autoscaling; that would give us more control over instance scaling than we have now.
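For reference, a minimal app.yaml sketch of what we use today (the instance counts and target are illustrative; the commented-out memory_utilization block is the hypothetical setting this request asks for, not an existing one):
runtime: custom
env: flex
automatic_scaling:
  min_num_instances: 2
  max_num_instances: 10
  cpu_utilization:
    target_utilization: 0.6
  # Hypothetical (does not exist today); what this request asks for:
  # memory_utilization:
  #   target_utilization: 0.6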
mk...@google.com <mk...@google.com> #5
Hi, I do not see any memory metrics in either the console or Stackdriver. Is this connected to this issue?
[Deleted User] <[Deleted User]> #6
Hello Google team, I was directed to this issue by the support team when we raised concerns about the non-availability of memory metrics for autoscaling. Is this feature released, on the roadmap, or not under consideration? Please provide some details.
[Deleted User] <[Deleted User]> #7
Hi,
Can somebody tell us the exact metric to use for memory-based autoscaling?
ed...@metlife.com.ar <ed...@metlife.com.ar> #8
Does anyone know whether there will be a long-term fix, or whether there has been any progress on this?
mh...@splunk.com <mh...@splunk.com> #9
At the very least, it would be great if copy-logs had an option to make the logging payloads BQ-conformant for use as an external table. It makes me super sad that this is still unresolved.
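For what it's worth, the kind of setup I'm after looks roughly like this (a sketch; log-schema.json, the bucket path, and the table name are placeholders, and today this trips over the non-conformant payload fields):
# Sketch only: define an external table over exported NDJSON logs in GCS.
bq mk \
  --external_table_definition=log-schema.json@NEWLINE_DELIMITED_JSON=gs://my-log-bucket/logs/* \
  mydataset.exported_logs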
sh...@google.com <sh...@google.com> #10
One possible workaround is to create the destination table with JSON-typed columns for the structured payload fields, using a schema like the following:
[
  {"name": "logName", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "resource", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "protoPayload", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "textPayload", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "timestamp", "mode": "NULLABLE", "type": "TIMESTAMP", "description": null, "fields": []},
  {"name": "receiveTimestamp", "mode": "NULLABLE", "type": "TIMESTAMP", "description": null, "fields": []},
  {"name": "severity", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "insertId", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "httpRequest", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "labels", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "operation", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "trace", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "spanId", "mode": "NULLABLE", "type": "STRING", "description": null, "fields": []},
  {"name": "traceSampled", "mode": "NULLABLE", "type": "BOOLEAN", "description": null, "fields": []},
  {"name": "sourceLocation", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []},
  {"name": "split", "mode": "NULLABLE", "type": "JSON", "description": null, "fields": []}
]
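Assuming that schema is saved as log_schema.json (a placeholder name; you may need to drop the null description entries depending on your bq version), the table can be created with:
# Create the destination table from the schema file; the project, dataset,
# and table names are placeholders.
bq mk --table [project_id]:[dataset].[table] log_schema.json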
After that, we can query the table using BQ JSON functions. For example:
SELECT
  logName,
  timestamp,
  JSON_VALUE(protoPayload, '$.methodName') AS method_name,
  JSON_VALUE(protoPayload, '$.authenticationInfo.principalEmail') AS principal_email
FROM
  `[project_id].[dataset].[table]`
WHERE
  JSON_VALUE(protoPayload, '$.methodName') = "io.k8s.coordination.v1.leases.update"
LIMIT
  100
ea...@noexternalmail.hsbc.com <ea...@noexternalmail.hsbc.com> #11
Dear team,
There's a new feature available in preview that allows you to export older logs: https://cloud.google.com/logging/docs/routing/copy-logs
But we need assistance in order to be able to visualize the logs located in a Cloud Storage bucket after they have been exported from a different GCP project.
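For reference, the copy operation described on that page looks roughly like this (a sketch; the log bucket, destination bucket, and location are placeholders):
# Copy logs from a Cloud Logging bucket to a Cloud Storage bucket
# (preview feature; _Default, the bucket name, and the location are placeholders).
gcloud logging copy _Default storage.googleapis.com/my-export-bucket \
  --location=global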
Description
We are looking for a way to export older logs.
We tried manually exporting logs:
gcloud logging read [FILTER] --format json > logs.json
then converted the output to newline-delimited JSON:
jq -c '.[]' logs.json > logs.ndjson
and then tried to create a new table from source data = JSON (Newline Delimited) in the BigQuery UI. This doesn't work because structured log fields such as "@type" aren't valid BigQuery column names, so the load fails with the error:
query: Invalid field name "@type". Fields must contain only letters, numbers, and underscores, start with a letter or underscore, and be at most 128 characters long.
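One way to work around the field-name restriction is to rename the offending keys before loading; a minimal sketch (replacing the leading "@" with "_" is our own convention, not something BigQuery prescribes):
# Recursively rewrite keys like "@type" to "_type" so the NDJSON passes
# BigQuery's column-name rules; requires jq 1.6+ for the built-in walk().
jq -c 'walk(if type == "object" then with_entries(.key |= sub("^@"; "_")) else . end)' \
  logs.ndjson > logs_clean.ndjson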