New component: GitLab Receiver #35207
I will sponsor this.
Hi all, thank you for all the great work you've been doing in the context of CI/CD pipelines and OpenTelemetry. I've been playing around with a GitLab receiver to create traces for CI/CD pipelines. That doesn't align exactly with this issue, but since the receiver has been working pretty well for me, it might be interesting for more folks here in the community. I'll be giving a talk about this next week at KubeCon. If anyone will be there, please let me know and we can catch up on this topic. I'd be happy to contribute (and maintain) this if there is interest from the community. :)
Hey @niwoerner, thanks for coming over from LinkedIn and posting here on the issue! I think the alignment is quite nice, given that the concept on the tracing end is the same. Would love support in contributing and maintaining this going forward. Let's sync up and figure out how we can move forward on this!
I believe the GitLab CLI can subscribe to events from a running pipeline.
So maybe it is possible to receive events using some kind of (web)hook instead of a scraper.
I would appreciate expanded use cases without redirections to GitHub, which may drift significantly from what is possible specifically with GitLab.
What pain point do you want to address first?
@abitrolly - the GitHub receiver is important context. Originally it had a different name; upon reconsideration of the implementation and the incorporation of other signals (tracing, logs), we as a community decided to separate things out and rename the receiver. This is a good summary of the decisions and context I mentioned above.
@adrielp I am asking about specific use cases to estimate the scope of work left from https://github.com/niwoerner/gitlabreceiver. Ideally it should be a checkbox list.
I plan to sync with @adrielp this week at KubeCon (or virtually afterwards). That should help determine how to proceed with the tracing functionality of the GitLab receiver I created. I would love a second opinion on a few of my implementation choices. Based on that outcome we can define next steps for if (and how) we want to add the functionality to the gitlabreceiver here. PS: If anyone else reading this is attending the upcoming KubeCon and would like to exchange on this topic, please let me know :)
@niwoerner I am not able to attend KubeCon financially, but I found your slides https://static.sched.com/hosted_files/kccncna2024/86/Kubecon-NA-2024-Gitlab-CICD-Pipelines-Otel.pdf :D For some reason, pages with the blue stripe background are very slow to render, like page 22. Maybe it is just a problem with my Firefox viewer.
Regarding 1: To get traces for pipelines with the GitLab receiver you need to: 1) have a custom OTel collector with the GitLab receiver running somewhere, and 2) configure the group/repo webhook settings to send pipeline events to the collector. Regarding 2: Do you mean the notification slide? The notification is received in Microsoft Teams and sent from Dynatrace. Feel free to open a discussion in the gitlabreceiver repo or send me a message via Slack/LinkedIn if you have more questions about the presentation :) (I don't want to abuse this issue for it :D)
Sure, but I don't think it hurts to discuss here how to test the current state of GitLab tracing in the easiest way. :)
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping |
#### Description
This PR adds the structure and trace skeleton for a new and already accepted GitLab receiver. (Thanks @atoulme for sponsoring this!) The GitLab receiver aligns very closely with the GitHub receiver, and this PR mostly mirrors the change from PR #36632. I'm working together with @adrielp on building out the GitLab receiver. More PRs to introduce metrics and actual tracing functionality will follow.
#### Link to tracking issue
#35207
#### Testing
Added basic tests and built the component to verify that the health check endpoint operates correctly when tracing is enabled.
#### Documentation
Docs on how to configure the GitLab receiver via webhooks have been added. While the receiver can be configured after this PR, it will not actually do anything yet, since it is under development and this is just the skeleton PR.
@adrielp @niwoerner Annoyingly, I am planning to revert the skeleton PR because it seems to have caused problems with our CI. I can't explain why; the issue coincides with the commit that added the skeleton and subsides when the commit is reverted. See https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/workflows/build-and-test.yml?query=branch%3Amain and #37177. After the revert, please resubmit a PR with the skeleton and we can work together to figure out why it is causing an issue.
It seems I was wrong about the cause. No revert happening.
Any news on the current state? I would really appreciate this new component 👍
I plan to continue with the implementation next week, so if everything goes as planned there should be a new PR within the next few weeks :)
GitLab folks developed a pipeline analyzer: https://gitlab.com/gitlab-org/gitlab/-/issues/508903. Maybe they could adapt the intermediate data format to be OpenTelemetry. But the wishlist wants to visualize spans as soon as they appear, in real time, which OpenTelemetry does not support.
@abitrolly - There's a lot going on. And if you don't want to wait for this contribution to be built into the OpenTelemetry collector, you can build your own and get traces today! That's one of the awesome things about OpenTelemetry and the collector: anyone is free to build their own distribution using ocb, pointing to @niwoerner's original implementation of the GitLab receiver. It won't be identical, but it'll be close and provide immediate value. The spans will appear in close to real time (the only latency is from the originating event -> conversion -> sending to a backend, which is minimal), and the span times as they appear in the backend map to the actual times the events occurred, with no added latency. On the GitLab side, there have been conversations for years about how to approach and solve this issue. See:
Today I tested this implementation of the GitLab receiver in PR #39123 against our non-production instance of GitLab, and it worked. I added some code from @niwoerner's implementation to fill the span attributes, and in the end this looks a lot like what we need. We have just shut down GCPE for pipeline metrics because of the load on the API, so we are in need of a new solution. We'd be happy to support this, e.g. by testing and providing feedback. What else can we do to help?
#### Description
This PR adds tracing functionality for the gitlabreceiver. It's only possible to create deterministic trace/span IDs once a pipeline is finished; correlation with manually instrumented jobs during pipeline runtime is not possible because of limitations in GitLab. More details can be found [here](open-telemetry/semantic-conventions#1749 (comment)). (I would like to become an OpenTelemetry member to be a code owner of the gitlabreceiver - would someone be willing to sponsor the membership for me? Thank you! :) )
#### Link to tracking issue
#35207
#### Testing
Added unit tests and performed manual testing.
#### Documentation
Updated the README on how to use the receiver.
The purpose and use-cases of the new component
The purpose of this receiver is the same as the GitHub receiver, but in the context of GitLab. It will support metrics, logs, and traces from GitLab in almost the same way as the GitHub receiver. Much of this code already exists in some form or another.
Really, the logic and metrics are the same as in the GitHub receiver. The differences are:
Everything else is closely aligned. The CI/CD SIG will leverage this as one of the prototypes, alongside the GitHub receiver, helping to inform conventions and support telemetry in GitLab.
Example configuration for the component
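No configuration was given in the issue template. A hypothetical sketch, modeled on the GitHub receiver's webhook-based configuration; all field names here are assumptions until the component's README is final:

```yaml
receivers:
  gitlab:
    webhook:
      endpoint: 0.0.0.0:8080   # address the collector listens on for GitLab webhooks (assumed)
      path: /events            # path configured in the GitLab group/repo webhook settings (assumed)
      health_path: /health     # health check endpoint mentioned in the skeleton PR (assumed)
      secret: ${env:GITLAB_WEBHOOK_SECRET}  # assumed: validates the X-Gitlab-Token header

exporters:
  debug: {}

service:
  pipelines:
    traces:
      receivers: [gitlab]
      exporters: [debug]
```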
Telemetry data types supported
Metrics, Logs, Traces
Is this a vendor-specific component?
Code Owner(s)
adrielp
Sponsor (optional)
@atoulme
Additional context
No response