
Sonar performance

14 messages
Sonar performance

jelmer
Hi,

We're looking at using Sonar for a relatively large number of projects, some of which are of significant size (hundreds of thousands of lines of code and up). We are debating whether to give each project its own Sonar instance or to use a single Sonar instance for all projects. Obviously there are pros and cons to everything, but since we will probably also want to deploy some commercial plugins (Developer Cockpit etc.) that are priced per instance, it is clearly more cost-efficient to use a single Sonar instance for all projects. This would also allow us to run cross-project copy-and-paste detection and the like.

However, one of the concerns that were raised was that performance might not be acceptable, in particular if we adopt the notion of continuous inspection and build pipelines, where Sonar metrics are generated for virtually every check-in.

Is there any information available on the performance of Sonar?
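In the absence of published benchmarks, one pragmatic option is to gather your own numbers: wrap each analysis in a timer and log the wall-clock cost per project. A minimal sketch follows; the `time_analysis` helper and the CSV log path are invented for illustration, while `mvn sonar:sonar` is the usual Maven goal.

```shell
#!/bin/sh
# Log how long each analysis takes, so cost can be charted against project size.
# Usage: time_analysis <project-dir> <command...>
LOG="${LOG:-sonar-timings.csv}"

time_analysis() {
  project="$1"; shift
  start=$(date +%s)
  # Run the analysis command (e.g. mvn sonar:sonar) inside the project
  ( cd "$project" && "$@" ) || return 1
  end=$(date +%s)
  echo "$project,$((end - start))" >> "$LOG"
}

# Example: time_analysis /path/to/project mvn sonar:sonar
```

Charting these timings against lines of code over a few weeks would answer the "how costly is per-commit analysis" question for your own codebase better than any generic benchmark.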



Re: Sonar performance

Patroklos Papapetrou
Hi jelmer

There is a page, http://docs.codehaus.org/display/SONAR/Performances, that gives you some tips, mostly on DB performance.
In my experience, Sonar's performance increases dramatically if the DB and Sonar run on the same machine. Beyond that, it depends on what plugins you have installed, whether you have enabled cross-project duplication detection (it slightly decreases performance, but I strongly recommend enabling it), the number of projects (and of course their size), and how frequently you analyse them.

Replying to your rhetorical (I assume) question about having one Sonar instance per project: for me that is unacceptable. It is very hard to maintain, and you reserve memory and other resources for each instance without actually using them. So for me one Sonar installation is the way to go; you then tune it to fit your demands.
Sonar's demo ( http://nemo.sonarsource.org ) hosts millions of lines of source code and hundreds of projects with no particular problems.
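To make the tuning points above concrete, here is roughly what the relevant analysis properties looked like in the Sonar 3.x era; treat the exact names and values as a sketch to verify against your version's documentation. The script only prints the invocation it would run, so it is side-effect free:

```shell
#!/bin/sh
# Analysis properties touching the points above: a co-located DB and
# cross-project duplication detection. Names follow the Sonar 3.x-era docs.
SONAR_OPTS="-Dsonar.host.url=http://localhost:9000 \
  -Dsonar.jdbc.url=jdbc:mysql://localhost:3306/sonar \
  -Dsonar.jdbc.username=sonar \
  -Dsonar.cpd.cross_project=true"

# Print (rather than execute) the command this sketch would use
echo mvn sonar:sonar $SONAR_OPTS
```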





RE: Sonar performance

Laurent Malvert-2
In reply to this post by jelmer
> From: Papapetrou P.Patroklos [mailto:[hidden email]]
> [...]
> However one of the concerns that were raised was that performance might
> not be acceptable in particular if we adopt the notion of continuous
> inspection and build pipelines where Sonar metrics are being generated
> for virtually every check-in.

There isn't much point in doing this.

You're far better off having your continuous integration kick in for every check-in, but continuous inspection running ONLY on successful builds AND at most once per day.
I doubt you need finer granularity (and I remember reading that Sonar records changes only at 24-hour intervals).
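That policy is easy to automate. A toy gate, assuming a CI server that exports the build result (the stamp file and the `BUILD_STATUS` variable are invented names, not Sonar features): let an analysis through only on a successful build, and at most once per 24 hours.

```shell
#!/bin/sh
# Gate a Sonar analysis: only on success, at most once per 24 hours.
STAMP_FILE="${STAMP_FILE:-/tmp/last-sonar-analysis}"

should_analyse() {
  build_status="$1"                     # e.g. exported by the CI server
  [ "$build_status" = "SUCCESS" ] || return 1
  if [ -f "$STAMP_FILE" ]; then
    # find prints the file only if it is older than 1440 minutes (24 h)
    find "$STAMP_FILE" -mmin +1440 | grep -q . || return 1
  fi
  return 0
}

if should_analyse "${BUILD_STATUS:-SUCCESS}"; then
  echo "would run: mvn sonar:sonar"
  touch "$STAMP_FILE"
else
  echo "skipping Sonar analysis"
fi
```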

--
Laurent Malvert | Senior Software Developer | IDBS
www.idbs.com

Powering Science for a Better Future


Re: Sonar performance

Ann Campbell
For the record, Laurent: after a configurable period (default 24 hours), Sonar retains only one snapshot per day. Within that initial period, however, all snapshots are retained.





--
G. Ann Campbell
Sr. Systems Engineer, IS Production Systems - Shop Floor Systems
Shaw Industries Inc,
201 S. Hamilton St.
Dalton Ga 30720




Re: Sonar performance

jelmer
In reply to this post by Patroklos Papapetrou
Hi Patroklos,

Thank you for taking the time to reply. I had already seen the wiki page you linked. It mostly covers how to improve the performance of Sonar, but it does not show any benchmarks or anything similar, which is really what I am after.

Also, since we are looking at doing Sonar analysis "continuously", we're also interested in "write performance". I've seen Nemo, and I know that the interface copes with many lines of code. However, I am assuming that those projects are imported once a day at most. If we go down the continuous-delivery route, we would potentially analyse projects many times a day, which I imagine would be costly. I am interested in finding out just how costly, and whether anyone has experience with such a setup.

I assure you that my question about a Sonar instance per project was not rhetorical. I agree with you that it would make maintenance more difficult and costly, and that you would not be making full use of your machines' resources, but by using virtualization and tools like Puppet or Chef you can take some of that pain away.







Re: Sonar performance

Fabrice Bellingard-4
In reply to this post by Laurent Malvert-2

Indeed, there's little interest in running a full analysis for every commit (whereas there is for compiling and running tests).
FYI, what Laurent suggests is exactly what we currently do here at SonarSource, and it's working great.

 


Re: Sonar performance

David Racodon-2
In reply to this post by jelmer
Hi,

As an example, we work with a company that analyses thousands of projects every night, for a total of 500+ million lines of code, all on a single Sonar instance.
So if I were you, I would start with a single instance; it should keep suiting you for a long time.

Regards,


David RACODON | SonarSource
Senior Consultant









Re: Sonar performance

jelmer
In reply to this post by Fabrice Bellingard-4
Hi Fabrice,

Indeed, that is what we have used on other projects in the past. We'd basically have a build that ran Sonar every night, along with functional tests and anything else too slow to run in the continuous build.

However, recently there has been much ado about "continuous delivery", a practice popularized by the book of the same name, which suggests running all steps of the build process all the time in a "pipeline" of sorts.
I am sure you are familiar with it, but in case you are not, there is an excerpt of the book explaining the concept available at InformIT.

Personally, I do see the value that such an approach offers.

As a developer, I would really appreciate being able to see any violations that PMD or FindBugs flagged straight away.

And if a team configures the build-breaker plugin, then I am sure that team will want to know that the build is broken straight away, rather than on the morning of "release day".

Is really nobody doing this right now? And if not, why not?





Re: Sonar performance

David Racodon-2
Hi,

For your use case, if your development teams use Eclipse, they could run local analyses using Sonar in Eclipse: http://docs.codehaus.org/display/SONAR/Using+Sonar+in+Eclipse

Regards,

David RACODON | SonarSource
Senior Consultant







Re: Sonar performance

jelmer
Hi David,

Thanks again for taking the time to respond. It's nice to see credible people like book authors and SonarSource employees being so responsive on this list.

I agree that using the Eclipse plugin gives you some of the same benefits, but do you advise against running Sonar from a build pipeline, and if so, why?






Re: Sonar performance

David Racodon-2
Hi,

As Fabrice and Laurent said, I would advise running a Sonar analysis once a day (usually at night). You can also give development teams the ability to manually trigger a Sonar analysis during the day if they feel the need for a specific reason.

From my point of view, running a Sonar analysis after every single check-in is pointless. One simple reason is that developers won't wait for the Sonar analysis to finish after their check-in. And there are many more reasons I won't list here.

If you were thinking of running a Sonar analysis after every single check-in in order to break the build whenever the check-in adds violations, forget it right away. For example: what if FindBugs logs a violation that is actually a false positive? There are many more reasons as well, which I won't list here.

I'd be glad to keep discussing those matters with you.
If you feel like it, switch to sonarsource.com, have a look at our consultancy offers and contact us from there.

Thank you

Regards,

David RACODON | SonarSource
Senior Consultant









Re: Sonar performance

jelmer
Personally I'd rather have Sonar reports available after a few minutes than the next day. But I agree that developers won't wait around for the results (nor should they). The same thing goes for functional tests. The idea of continuous delivery is getting feedback faster.

As for breaking the build every time a violation is added, I agree that would not be a good idea. But the build breaker plugin is triggered by alerts that you can configure yourself, and you can configure alerts on much more than just violations. Some people might want to break the build if unit test coverage drops below a certain percentage, for instance if they are contractually required to deliver software with at least that amount of coverage. I think that is a valid use case.
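The coverage gate described above boils down to a simple comparison. Here is a minimal standalone sketch of the idea (the threshold and measured values are illustrative; the real Build Breaker plugin reads the alert thresholds you define in Sonar, it does not use code like this):

```python
# Sketch of a coverage-gate check: decide whether a build should be
# broken because measured unit-test coverage fell below a contractual
# threshold. All numbers here are made up for illustration.

def coverage_gate(coverage_percent, threshold=80.0):
    """Return True when the build should be broken."""
    return coverage_percent < threshold

if __name__ == "__main__":
    for measured in (92.5, 74.0):
        verdict = "BREAK" if coverage_gate(measured) else "OK"
        print("coverage %.1f%% -> %s" % (measured, verdict))
```

In a CI job, a True result would translate into a non-zero exit code so the pipeline stage fails immediately rather than on "release day".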

But I'll let it rest. If someone else has any experience with this, I'd love to hear about it, in particular whether performance was an issue.

On Thu, Jul 5, 2012 at 5:04 PM, David Racodon <[hidden email]> wrote:
Hi,

As Fabrice and Laurent said, I would advise running a Sonar analysis once a day (usually at night). You can also give development teams the ability to manually trigger a Sonar analysis during the day if they feel the need for a specific reason.
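The once-a-day schedule described above is typically just a scheduled CI job or cron entry on the build machine. As a rough sketch (the time, project path, and log location are placeholders for your own setup):

```
# Run the nightly Sonar analysis at 02:00, after the regular
# continuous builds have quieted down.
0 2 * * * cd /opt/builds/myproject && mvn -B sonar:sonar >> /var/log/sonar-nightly.log 2>&1
```

The same schedule can of course be expressed as a timed job in Jenkins/Hudson instead of raw cron.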

From my point of view, running a Sonar analysis after every single check-in is useless. One simple reason is that the developers won't wait for the Sonar analysis to finish after their check-in. And there are many more reasons I won't list here.

If you thought of running a Sonar analysis after every single check-in to break the build anytime the check-in adds some violations, forget it right away. For example: what if FindBugs logs a violation that is actually a false positive? And there are many more reasons as well that I won't list here.

I'd be glad to keep discussing those matters with you.
If you feel like it, switch to sonarsource.com, have a look at our consultancy offers and contact us from there.

Thank you

Regards,

David RACODON | SonarSource
Senior Consultant



On 5 July 2012 16:22, jelmer <[hidden email]> wrote:
Hi David,

Thanks again for taking the time to respond. It's nice to see credible people like book authors and SonarSource employees being so responsive on this list.

I agree that using the Eclipse plugin gives you some of the same benefits, but do you advise against running Sonar from a build pipeline, and if so, why?

On Thu, Jul 5, 2012 at 3:55 PM, David Racodon <[hidden email]> wrote:
Hi,

For your use case, if your development teams use Eclipse, they could run local analyses using Sonar in Eclipse: http://docs.codehaus.org/display/SONAR/Using+Sonar+in+Eclipse

Regards,

David RACODON | SonarSource
Senior Consultant



On 5 July 2012 15:47, jelmer <[hidden email]> wrote:
Hi Fabrice,

Indeed, that is what we have done on other projects in the past. We'd basically have a build that would run Sonar every night, as well as functional tests and anything else that would be too slow to run in the continuous build.

However, recently there has been much ado about "continuous delivery", a practice popularized by the book of the same name, which suggests running all steps of the build process all the time in a "pipeline" of sorts.
I am sure you are familiar with it, but in case you are not, there's an excerpt of the book that explains the concept available at InformIT.

Personally I do see the value that such an approach offers. 

As a developer I would really appreciate being able to see any violations that PMD or FindBugs flagged straight away.

And if a team configures the build-breaker plugin then I am sure that that team will want to know that the build is broken straight away rather than on the morning of "release day"

Is really nobody doing this right now? And if so, why not?




RE: Sonar performance

Laurent Malvert-2
> From: jelmer [mailto:[hidden email]]
>
>Personally i'd rather have sonar reports available after a few minutes
> than the next day. [...]

Except that won't be the case either, especially for large applications.
We have a build on a multi-module application with ~500K Java LOC.

Running a full build (clean, build, unit + integration tests) takes about
21 to 25 minutes on our single-node CI server (and our test coverage
is not so great, so it should grow longer).

Our daily Sonar task takes more than twice as much time: between
1h and 1h10min. And of course that depends a lot on how many new
violations are part of check-ins (lately, we have mostly been removing a
lot of them, but I've seen a few spikes up to 1h40min for the
same build, apparently because something triggered some rules
to take longer than usual).

And we pretty much run only classic CheckStyle + PMD + FindBugs rules,
and the internal Sonar rules (it's a 3.0 version at the moment).


So, as you mentioned large projects, you can see that there
really would be no way to get results after "a few minutes".
I can understand that you want them as early as possible,
but then you'd need to settle for some interval.


Hope this convinces you.
If it doesn't, I'd recommend you give it a shot, as you won't really
know until you try. Maybe in your context it will take a satisfying
amount of time and you'll be able to do something close to what you
want, or realize that it doesn't work.

I myself would be curious to hear if you manage to achieve your pipeline
and provide instant reporting.

(sorry about the lengthy copyright notice in the sig., our mail server
adds that... their fault, not mine ;)

--
Laurent Malvert | Senior Software Developer | IDBS
www.idbs.com

Powering Science for a Better Future






The information contained in this email may contain confidential or legally privileged information. If you are not the intended recipient any disclosure, copying, distribution or taking any action on the contents of this information may be unlawful. If you have received this email in error, please delete it from your system and notify us immediately. Any views expressed in this message are those of the individual sender, except where the message states otherwise. IDBS takes no responsibility for any computer virus which might be transferred by way of this email and recommends that you subject any incoming E-mail to your own virus checking procedures. We may monitor all E-mail communication through our networks. If you contact us by E-mail, we may store your name and address to facilitate communication.

RE: Sonar performance

jpfayolle
In reply to this post by jelmer

Hi Jelmer,

 

Actually, when it comes to performance, I would recommend paying specific attention to the architecture.

Database and disk usage are the two main points:

. Some databases perform better or have more consistent performance than others. In my experience, Oracle and SQL Server both perform well, but SQL Server performance degrades faster as it takes more resources (memory especially).

. Having a dedicated database server, physical if possible. I have seen great performance variations on virtual machines.

. Having a dedicated database instance instead of a schema in an existing instance.

. Optimizing the parameters on the DB.

. Regularly checking disk usage. I have seen >100% performance improvement just by cleaning the disk, i.e. backing up elsewhere what is not needed and defragmenting. Above 50% disk usage, performance decreases. Having disks with good performance also helps.
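Some of these database points surface directly in Sonar's conf/sonar.properties. As a sketch only (the Oracle URL is a placeholder for a dedicated instance, and the pool sizes are illustrative, not recommended values):

```
# Point Sonar at a dedicated database instance rather than the
# default embedded database (example Oracle URL, placeholder host).
sonar.jdbc.url:                  jdbc:oracle:thin:@dbhost:1521:sonar
sonar.jdbc.driverClassName:      oracle.jdbc.OracleDriver
sonar.jdbc.username:             sonar
sonar.jdbc.password:             sonar

# Connection-pool sizing (illustrative values; tune to your load).
sonar.jdbc.maxActive:            20
sonar.jdbc.maxIdle:              5
sonar.jdbc.minIdle:              2
```

Pool sizing mostly matters when several analyses can hit the same instance concurrently, as in the multi-project setup discussed in this thread.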

 

I believe this question of performance is not only about Sonar, because you have other tools and tasks in your process. So optimizing your architecture and defining the right procedures (backup, etc.) might benefit all of them.

 

It might not be pertinent, but here is a test I did recently on a COBOL analysis.

An application of 1,387 COBOL programs, around 250,000 total lines, of which 185,000 are lines of code. I changed one file (-2 violations) and the analysis took 15 min on my laptop, so not the quickest machine. Although I do check my hard disk regularly :)

 

Do not hesitate to ask for any further details.

Regards.

 

Jean-Pierre FAYOLLE

www.qualilogy.com

 

 

 
