
Fix measure performance frequency + Add loss log #56

Open
EmanueleGhelfi wants to merge 23 commits into master

Conversation

@EmanueleGhelfi (Contributor) commented Mar 27, 2020

Description

Fixes #58
Fixes #57
Fixes #53
Fixes #60

Fixes the examples (they were broken)

Type of change

Please delete options that are not relevant.

  • Bugfix (i.e., a non-breaking change which fixes an issue)
  • New feature (i.e., a non-breaking change which adds functionality)
  • Breaking change (i.e., a fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented on my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged and published in downstream modules

emanuele added 3 commits March 27, 2020 17:20
Now executors can log their values on TensorBoard. This makes it possible to track the individual components of a composite loss, and it keeps loss logging separate from metric logging.
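For illustration, a minimal sketch of the idea (not the actual ashpy executor API; the class name, signature, and attributes below are assumptions): the executor computes its loss component and writes it to TensorBoard with `tf.summary.scalar`, under its own tag and independently of the metrics.

```python
import tensorflow as tf


class LossLoggingExecutor:
    """Illustrative executor that logs its own value to TensorBoard.

    Sketch only: names and signatures here are assumptions, not the ashpy API.
    """

    def __init__(self, loss_fn, name: str = "loss") -> None:
        self._loss_fn = loss_fn
        self._name = name

    def __call__(self, y_true, y_pred, step: int) -> tf.Tensor:
        # Compute the value of this single loss component.
        value = self._loss_fn(y_true, y_pred)
        # Write it under a dedicated tag so it gets its own curve,
        # separate from the metrics logging.
        tf.summary.scalar(self._name, value, step=step)
        return value
```

With a default summary writer active (e.g. `tf.summary.create_file_writer(logdir).set_as_default()`), each component then appears as its own scalar curve in TensorBoard.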
@EmanueleGhelfi (Contributor, Author) commented:

Work in Progress

@codecov (bot) commented Mar 27, 2020

Codecov Report

Attention: Patch coverage is 74.74747% with 25 lines in your changes missing coverage. Please review.

Project coverage is 87.17%. Comparing base (7ca6881) to head (6d74bff).
Report is 33 commits behind head on master.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/ashpy/trainers/gan.py | 55.00% | 9 Missing ⚠️ |
| src/ashpy/losses/executor.py | 72.72% | 6 Missing ⚠️ |
| src/ashpy/callbacks/classifier.py | 33.33% | 4 Missing ⚠️ |
| src/ashpy/losses/gan.py | 88.00% | 3 Missing ⚠️ |
| src/ashpy/trainers/classifier.py | 66.66% | 2 Missing ⚠️ |
| src/ashpy/trainers/trainer.py | 92.30% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master      #56      +/-   ##
==========================================
+ Coverage   86.67%   87.17%   +0.49%     
==========================================
  Files          56       56              
  Lines        2162     2206      +44     
==========================================
+ Hits         1874     1923      +49     
+ Misses        288      283       -5     


@EmanueleGhelfi changed the title from "Fix measure performance frequency + Add loss log" to "WIP: Fix measure performance frequency + Add loss log" on Mar 28, 2020
@EmanueleGhelfi changed the title from "WIP: Fix measure performance frequency + Add loss log" to "Fix measure performance frequency + Add loss log" on Apr 8, 2020
@mr-ubik added the "bug (Something isn't working)" and "enhancement (New feature or request)" labels on Apr 8, 2020
@mr-ubik removed the "enhancement (New feature or request)" label on Apr 8, 2020
@mr-ubik (Contributor) left a comment:

Mostly my OCD about type hints and docs + two clarifications

Review threads:
  • src/ashpy/callbacks/classifier.py (resolved)
  • src/ashpy/callbacks/classifier.py (outdated, resolved)
  • src/ashpy/losses/classifier.py (outdated, resolved)
  • src/ashpy/losses/executor.py (outdated, resolved)
  • src/ashpy/callbacks/classifier.py (resolved)
  • src/ashpy/losses/executor.py (resolved)
  • src/ashpy/losses/executor.py (outdated, resolved)
@@ -114,7 +114,9 @@ def get_discriminator_inputs(
 class GeneratorAdversarialLoss(GANExecutor):
     r"""Base class for the adversarial loss of the generator."""
 
-    def __init__(self, loss_fn: tf.keras.losses.Loss = None) -> None:
+    def __init__(
+        self, loss_fn: tf.keras.losses.Loss = None, name="GeneratorAdversarialLoss"
Contributor:

Please add type annotations to both the signature and the docs.
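For reference, a fully annotated version of the new signature could look like the sketch below; the `Optional` hint, the docstring wording, and the stored attributes are assumptions, not the code merged in this PR:

```python
from typing import Optional

import tensorflow as tf


class GeneratorAdversarialLoss:  # in ashpy this subclasses GANExecutor
    r"""Base class for the adversarial loss of the generator."""

    def __init__(
        self,
        loss_fn: Optional[tf.keras.losses.Loss] = None,
        name: str = "GeneratorAdversarialLoss",
    ) -> None:
        """Initialize the loss.

        Args:
            loss_fn (:py:class:`tf.keras.losses.Loss`): Keras loss to wrap.
            name (str): Name of the loss; assumed to be unique among the
                losses passed to a trainer (see the discussion below).

        """
        self._loss_fn = loss_fn
        self._name = name
```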

Contributor:

Also very important: must this name argument be unique? Does it behave like the name argument for Metric and Callback?

Contributor (Author):

Yes, the loss name should be unique. We actually log the loss as ashpy/loss_name.
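In other words, the scalar tag is derived from the loss name (the `ashpy/` prefix is taken from the comment above; the helper itself is hypothetical), so two losses sharing a name would end up writing to the same TensorBoard tag:

```python
import tensorflow as tf


def log_loss(name: str, value: tf.Tensor, step: int) -> None:
    """Hypothetical helper: log one loss component under the ashpy/<name> tag."""
    tf.summary.scalar(f"ashpy/{name}", value, step=step)
```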

Contributor (Author):

Where can we specify that the name should be unique?

Member:

In the documentation. We must also keep track of every loss passed to a trainer, store the names, and check for duplicates. If there are duplicates, raise an exception.
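A minimal sketch of such a check, assuming the trainer receives its losses up front and that each loss exposes a `name` attribute; the helper name and the choice of `ValueError` are assumptions, not taken from the PR:

```python
from collections import Counter
from typing import Sequence


def validate_unique_loss_names(losses: Sequence) -> None:
    """Raise if two losses passed to the same trainer share a name."""
    names = [loss.name for loss in losses]  # `name` attribute is assumed
    duplicates = [n for n, count in Counter(names).items() if count > 1]
    if duplicates:
        raise ValueError(
            f"Loss names must be unique per trainer; duplicates: {duplicates}"
        )
```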

Contributor (Author):

Added the check in the trainers; still need to add the documentation.

Review threads:
  • src/ashpy/losses/gan.py (resolved)
  • src/ashpy/metrics/classifier.py (outdated, resolved)
Labels: bug (Something isn't working)
Projects: None yet
Participants: 3