Is there a Java annotation for errors?

I am looking for a standard Java annotation that lets me mark some code as "associated with bug PROJECT-312". The goal is to be able to generate a report showing which parts of the code have been modified or otherwise affected by a bug. That would, for example, reveal the "hot spots" where many bugs accumulate, or make it easy to jump from the IDE to JIRA / Bugzilla to see what a given bug was about.

Is there a standard annotation that I can / should use, or do I need to write my own?

PS: I know about Mylyn / Tasktop, which would do this tracking for me. For my purposes these tools are too disruptive right now, because they significantly change how people work day to day.

+6
2 answers

Oracle approach

The Java API Specification should contain assertions sufficient to enable Software Quality Assurance to write complete Java Compatibility Kit (JCK) tests.

This means that doc comments must satisfy the needs of SQA conformance testing. Doc comments should not document bugs or describe how an implementation that currently deviates from the specification happens to work.

From the official Javadoc guide:

Code bugs are bugs in the implementation rather than in the API specification. Code bugs and their workarounds are often distributed separately in a bug report. However, if the Javadoc tool is being used to generate documentation for a particular implementation, it would be quite useful to include this information in the doc comments, suitably separated as a note or by a custom tag (say, @bug).

So basically the rule is: do not mix documentation with bug reports. Use and parse a dedicated custom tag in the doc comments; you really do not need more than that for a workable bug report.
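Following that advice, a doc comment with such a custom @bug tag might look like the sketch below (the method name and issue key are invented for illustration). The javadoc tool's standard -tag option can then render the tag under its own heading instead of warning about an unknown tag:

```java
/**
 * Parses the configuration file.
 *
 * @bug PROJECT-312  NPE on empty input; worked around with an explicit
 *                   length check until the parser is fixed.
 */
void parseConfig() { /* ... */ }
```

Running javadoc with `-tag bug:a:"Known Bugs:"` tells the tool to accept @bug in all locations and print its text under a "Known Bugs:" heading in the generated pages.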

In addition, with the Eclipse Jira connector or similar tools, you can automatically turn your @bug and TODO comments into tracker issues.

Update

If you need to, you can create some custom annotations: tailor them to your needs, document them, and apply them consistently across the whole team. Read more about it here.

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;

    @Target({ ElementType.TYPE })
    @Retention(RetentionPolicy.CLASS) // Kept in the class file, but not available via reflection.
    public @interface ClassBug {}     // gives you the @ClassBug annotation

    @Target({ ElementType.METHOD })
    @Retention(RetentionPolicy.CLASS) // Kept in the class file, but not available via reflection.
    public @interface MethodBug {}    // gives you the @MethodBug annotation
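If you want a report generator to find the markers via reflection rather than by scanning bytecode, you need RUNTIME retention instead of CLASS. A minimal sketch, where the @Bug name, its value() element carrying the issue key, and the sample method are all assumptions rather than any standard API:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Hypothetical @Bug annotation carrying the tracker issue key.
@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME) // RUNTIME so a report tool can read it via reflection.
@interface Bug {
    String value(); // e.g. "PROJECT-312"
}

public class BugReport {
    @Bug("PROJECT-312")
    static void patchedMethod() {}

    public static void main(String[] args) {
        // Scan the class and print every method tagged with @Bug.
        for (Method m : BugReport.class.getDeclaredMethods()) {
            Bug bug = m.getAnnotation(Bug.class);
            if (bug != null) {
                System.out.println(m.getName() + " -> " + bug.value());
            }
        }
    }
}
```

A build-time report could do the same scan over all classes on the classpath and group methods by issue key.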
+3

Personally, I am not convinced that annotations are the right way to track hot spots. For one thing, they will be ugly in bug-dense areas (more @Bug lines than real code!), and for another they will uselessly clutter places that are unlikely to regress, for example when some code is first implemented.

More to the point, a @Bug annotation is only useful if it is applied consistently. Enforcing that would be a hassle for everyone involved and would slow people down without providing much insight, since you have no way of knowing which bug-affected code never received an annotation.

Better, I would say, would be to implement some kind of external analysis that identifies the files touched by bug fixes (commit message matches [bB][uU][gG]:? *\d+ or something like that) and mines the history that way. You can retroactively examine all past bug fixes without adding any extra process for your developers.
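A rough sketch of that mining step. The commit representation and sample data below are invented for illustration; in practice you would feed it the output of something like `git log --name-only`:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

// Count how often each file appears in a commit whose message references a bug.
// Files with the highest counts are the "hot spots".
public class BugHotSpots {
    private static final Pattern BUG_REF = Pattern.compile("[bB][uU][gG]:? *\\d+");

    // Each commit is modeled as [message, file1, file2, ...] (an assumption for this sketch).
    public static Map<String, Integer> hotSpots(List<String[]> commits) {
        Map<String, Integer> counts = new HashMap<>();
        for (String[] commit : commits) {
            if (BUG_REF.matcher(commit[0]).find()) {
                for (int i = 1; i < commit.length; i++) {
                    counts.merge(commit[i], 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String[]> log = List.of(
            new String[]{"Fix Bug: 312 null check", "src/Parser.java", "src/Lexer.java"},
            new String[]{"Add feature X", "src/Feature.java"},
            new String[]{"bug 99 off-by-one", "src/Parser.java"}
        );
        System.out.println(hotSpots(log).get("src/Parser.java")); // expected: 2
    }
}
```

Here src/Parser.java shows up in two bug-fix commits while src/Feature.java is never counted, which is exactly the signal the annotations were supposed to provide.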

Google has an interesting article on this: Bug Prediction at Google.


Regarding your comment that annotations are "stickier" than comments and so have a better chance of surviving: I also wonder how often that difference is useful in practice. More often I find that bug comments in code stop being informative and stick around longer than necessary. If svn|hg|git|whatever blame on the relevant lines shows no commits associated with the bug in question, the code has probably been rewritten several times while the comment tagged along.

Of course, I'm not saying that what you describe never happens, but I wonder how often it does. If in your experience comments disappear just when they would still be useful, then by all means see whether annotations do better.

+2

Source: https://habr.com/ru/post/945561/

