Commit: undo cache

SpellOnYou committed Jan 5, 2022
1 parent de27a4b commit 7816666
Showing 6 changed files with 24 additions and 24 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -1,6 +1,7 @@
 _candidate
 .sass-cache
 .jekyll-metadata
+.jekyll-cache
 node_modules
 Gemfile.lock
 
3 changes: 2 additions & 1 deletion Gemfile
@@ -15,4 +15,5 @@ group :jekyll_plugins do
   gem 'jekyll-sitemap'
   gem 'jekyll-paginate'
   gem 'jekyll-seo-tag'
-end
+end
+gem "webrick", "~> 1.7"
13 changes: 9 additions & 4 deletions _config.yml
@@ -54,10 +54,15 @@ defaults:
       layout: page
 
 # Syntax
-markdown:
-  kramdown:
-    input: GFM
-    syntax_highlighter: rouge
+markdown: kramdown
+highlighter: rouge
+kramdown:
+  input: GFM
+  syntax_highlighter_opts:
+    default_lang: html
+    css_class : 'syntax'
+
+
 
 # Paginate
 paginate: 10
4 changes: 0 additions & 4 deletions _layouts/default.html
@@ -22,10 +22,6 @@
 <!-- Theme Stylesheet -->
 <link rel="stylesheet" href="{{site.baseurl}}/assets/css/theme.css">
 
-<!-- Syntax Highlither, rouge Stylesheet -->
-
-<link href="{{site.baseurl}}/assets/css/syntax.css" rel="stylesheet" >
-
 <!-- Jquery on header to make sure everything works, the rest of the scripts in footer for fast loading -->
 <script
   src="https://code.jquery.com/jquery-3.3.1.min.js"
27 changes: 12 additions & 15 deletions _posts/part2/2021-12-22-RF-transformer.md
@@ -33,18 +33,16 @@ most of the code and model architecture comes from the paper [Attention is all you need]
 
 A1.
 
-<pre>
-<code class="python">
-class PositionalEncoding(nn.Module):
-    def __init__(self, d_model:int):
-        super().__init__()
-        self.register_buffer('freq', 1/(10000 ** (torch.arange(0, d_model, 2.))))
-    def __call__(self, pos):
-        inp = torch.outer(pos, self.freq)
-        enc = torch.cat([inp.sin(), inp.cos()], dim=-1)
-        return enc
-</code>
-</pre>
+{% highlight python %}
+class PositionalEncoding(nn.Module):
+    def __init__(self, d_model:int):
+        super().__init__()
+        self.register_buffer('freq', 1/(10000 ** (torch.arange(0, d_model, 2.))))
+    def __call__(self, pos):
+        inp = torch.outer(pos, self.freq)
+        enc = torch.cat([inp.sin(), inp.cos()], dim=-1)
+        return enc
+{% endhighlight %}
 
 [^5]
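As a side note (not part of the commit): the `PositionalEncoding` module in the hunk above can be sanity-checked with a small snippet. The `d_model` and position values below are illustrative assumptions, not taken from the post.

```python
import torch
import torch.nn as nn

# Same module as in the post's hunk above
class PositionalEncoding(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        # inverse frequencies for the even dimensions
        self.register_buffer('freq', 1 / (10000 ** (torch.arange(0, d_model, 2.))))
    def __call__(self, pos):
        inp = torch.outer(pos, self.freq)
        # sin half in the first d_model/2 columns, cos half after
        return torch.cat([inp.sin(), inp.cos()], dim=-1)

pos = torch.arange(0, 4).float()          # positions 0..3
enc = PositionalEncoding(d_model=6)(pos)
print(enc.shape)                          # torch.Size([4, 6])
print(enc[0, :3])                         # position 0: sin(0) -> tensor([0., 0., 0.])
```

Note that the post overrides `__call__` directly rather than `forward`, so `nn.Module` hooks are bypassed; the `register_buffer` call still ensures `freq` moves with `.to(device)`.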

@@ -53,7 +51,7 @@ A1.
 
 However, as of Jan 2022, many papers report that positional encoding does not affect the performance much. Of course it depends on specifics 😉.
 
 
-```python
+{% highlight python %}
 positions = torch.arange(0, 100).float(); positions[:10]
 d_model = 26 # original paper set this value to 512

@@ -67,8 +65,7 @@ for i in range(0,3):
     ax[0].legend()
     ax[0].set_xlabel("relative position")
     ax[1].plot(res[:,int(d_model/2+i)], label=f"cos, cur pos:{i}"); ax[1].legend()
-
-```
+{% endhighlight %}
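For context (not part of the commit): the hunk above elides the setup for `res`, `fig`, and `ax`. A self-contained sketch, under the assumption that `res` is the sin/cos encoding matrix built from the snippet earlier in the post, would be:

```python
import torch
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

positions = torch.arange(0, 100).float()
d_model = 26  # the post's value; the paper uses 512

# Encoding matrix: sin half in columns [0, d_model/2), cos half after
freq = 1 / (10000 ** torch.arange(0, d_model, 2.))
inp = torch.outer(positions, freq)
res = torch.cat([inp.sin(), inp.cos()], dim=-1)

fig, ax = plt.subplots(1, 2, figsize=(10, 4))
for i in range(0, 3):
    ax[0].plot(res[:, i], label=f"sin, cur pos:{i}")
    ax[0].legend()
    ax[0].set_xlabel("relative position")
    ax[1].plot(res[:, int(d_model / 2 + i)], label=f"cos, cur pos:{i}"); ax[1].legend()
fig.savefig("pe.png")
```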

A2

Binary file added assets/images/favicon copy.ico