
Conversation

@arvenil (Contributor) commented Jun 27, 2018

There are two issues with scraping I would like to address:

  • mysql_exporter_scrapes_total never increases

    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 2
    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 2
    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 2
    

    See #235 (comment): Add 'collect' URL parameter to filter enabled collectors

  • mysqld_exporter scrapes the source database twice on every request, which means double the load on the MySQL database itself. The scrape happens twice because Describe() also runs Collect(). Additionally, a new collector is registered on every request, so both Describe() and Collect() run on every request. This differs from other exporters, where the exporter is registered only once, so Describe() is called on the first HTTP request but not on subsequent ones. That was also true for this exporter up to this change: https://github.com/prometheus/mysqld_exporter/pull/235/files#r198540277 (a sketch after the results below makes the mechanism concrete).
    In our fork we had already fixed the first issue, but that resulted in:

    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 2
    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 4
    $ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
    mysql_exporter_scrapes_total 6
    

    Which clearly shows that the database is now scraped twice on every request.

This PR fixes both issues, so it now results in:

$ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
mysql_exporter_scrapes_total 1
$ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
mysql_exporter_scrapes_total 2
$ curl --silent localhost:9104/metrics | grep scrapes_total | grep -v '#'
mysql_exporter_scrapes_total 3

Internally, the database is indeed scraped only once per HTTP request.
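
To make the cause concrete, here is a minimal Go sketch of the pattern described above. It is not the actual mysqld_exporter code; Exporter, describeByCollect, and totalScrapes are illustrative names. A Describe() that derives its descriptors by running a full Collect() hits the database once for Describe() and again for the real Collect(). The fix is a Describe() that sends only descriptors it already knows, on an exporter that lives across requests so its counter keeps increasing.

package exporter

import "github.com/prometheus/client_golang/prometheus"

type Exporter struct {
    totalScrapes prometheus.Counter
}

// Problematic pattern: deriving descriptors by running a real collection.
// The registry calls Describe() and then Collect() for every
// request-scoped registration, so the database gets scraped twice.
func (e *Exporter) describeByCollect(ch chan<- *prometheus.Desc) {
    metrics := make(chan prometheus.Metric)
    go func() {
        e.Collect(metrics) // a full database scrape, only to learn the descs
        close(metrics)
    }()
    for m := range metrics {
        ch <- m.Desc()
    }
}

// Fixed pattern: send only the descriptors known up front; no database
// round trip happens here.
func (e *Exporter) Describe(ch chan<- *prometheus.Desc) {
    ch <- e.totalScrapes.Desc()
}

// Collect scrapes MySQL exactly once per call.
func (e *Exporter) Collect(ch chan<- prometheus.Metric) {
    e.totalScrapes.Inc() // must live across requests for the counter to grow
    // ... scrape MySQL once and send the resulting metrics ...
    ch <- e.totalScrapes
}

(client_golang's prometheus.DescribeByCollect helper implements essentially the first pattern, which is why using it from a per-request collector doubles the load.)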

@SuperQ (Member) left a comment:


Very nice, LGTM.

@SuperQ requested a review from @juliusv, Jun 27, 2018 15:41
Signed-off-by: Kamil Dziedzic <[email protected]>
ch <- e.metrics.TotalScrapes
ch <- e.metrics.Error
e.metrics.ScrapeErrors.Collect(ch)
ch <- e.metrics.MySQLUP

The corresponding Describe() hunk sends only the descriptors:

ch <- e.metrics.TotalScrapes.Desc()
ch <- e.metrics.Error.Desc()
e.metrics.ScrapeErrors.Describe(ch)
ch <- e.metrics.MySQLUP.Desc()

A member commented on this hunk:

So we basically don't even try to register most of the metrics descs for this collector anymore? That's probably fine, as it was best-effort anyway and we only have this one collector, but I was surprised it wasn't explicitly mentioned.


@arvenil (Contributor, author) replied Jun 28, 2018:

I've actually made a comment about it, but it looks like I forgot to press "Send" ;)

Yes. I don't think it makes sense, but your expertise here is highly welcome. To be honest, I don't see when registering those descriptions would be useful. Maybe if you write your own app: you register all metrics first, they get cached, and you can just focus on sending metrics. But for a standalone exporter, where we explicitly say we can't determine all metrics up front, I don't see a reason to do it. Any idea what the lack of registered descriptions could impact? I didn't look into the implementation details, so again, your knowledge here would be useful ;)

Also, registering those descriptions made sense as long as the collector was registered once, at binary start: the first HTTP request called Describe(), and subsequent requests used the cached descriptions without calling Describe() again. Now we register the collector on every scrape, so Describe() runs on every HTTP request anyway and the cached descriptions are never reused.

Initially I thought my idea was an ugly hack, but I checked later and node_exporter does the same: https://github.com/prometheus/node_exporter/blob/master/collector/collector.go#L106-L110 They also register the collector on every HTTP request in a similar way: https://github.com/prometheus/node_exporter/blob/master/node_exporter.go#L47
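
For reference, here is a rough sketch of that per-request registration pattern, assuming a hypothetical newExporter constructor; it mirrors the linked node_exporter handler rather than quoting it:

package exporter

import (
    "net/http"

    "github.com/prometheus/client_golang/prometheus"
    "github.com/prometheus/client_golang/prometheus/promhttp"
)

// newExporter is a hypothetical stand-in for the real exporter constructor.
func newExporter() prometheus.Collector {
    return prometheus.NewGoCollector()
}

// metricsHandler builds a fresh registry per request, so Describe() and
// Collect() both run on every scrape and no cached descriptions survive
// between requests.
func metricsHandler(w http.ResponseWriter, r *http.Request) {
    registry := prometheus.NewRegistry()
    registry.MustRegister(newExporter()) // re-registered on every request

    h := promhttp.HandlerFor(registry, promhttp.HandlerOpts{})
    h.ServeHTTP(w, r)
}

With a fresh registry per request, nothing persists between scrapes except state the exporter itself carries, which is why the scrape counter has to live outside the handler.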


The member replied:

Yeah, it's fine in this case.

Signed-off-by: Kamil Dziedzic <[email protected]>
@juliusv (Member) commented Jun 28, 2018:

👍

@SuperQ merged commit 4d7577b into prometheus:master on Jun 28, 2018
@arvenil (Contributor, author) commented Jun 28, 2018:

@SuperQ Does this also close #245 and #269?
