<pre style='margin:0'>
Ryan Schmidt (ryandesign) pushed a commit to branch master
in repository macports-ports.

</pre>
<p><a href="https://github.com/macports/macports-ports/commit/298cc6bcf5b17583367e2779535567f262dc5726">https://github.com/macports/macports-ports/commit/298cc6bcf5b17583367e2779535567f262dc5726</a></p>
<pre style="white-space: pre; background: #F8F8F8">The following commit(s) were added to refs/heads/master by this push:
<span style='display:block; white-space:pre;color:#404040;'>     new 298cc6bcf5b py-sentence-transformers: Update to version 0.3.7
</span>298cc6bcf5b is described below

<span style='display:block; white-space:pre;color:#808000;'>commit 298cc6bcf5b17583367e2779535567f262dc5726
</span>Author: Steven Thomas Smith <s.t.smith@ieee.org>
AuthorDate: Thu Oct 1 15:22:03 2020 -0400

<span style='display:block; white-space:pre;color:#404040;'>    py-sentence-transformers: Update to version 0.3.7
</span>---
 python/py-sentence-transformers/Portfile | 46 ++++++++++++++++----------------
 1 file changed, 23 insertions(+), 23 deletions(-)

<span style='display:block; white-space:pre;color:#808080;'>diff --git a/python/py-sentence-transformers/Portfile b/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;color:#808080;'>index 35ae549449f..fa247f5fffc 100644
</span><span style='display:block; white-space:pre;background:#e0e0ff;'>--- a/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;background:#e0e0ff;'>+++ b/python/py-sentence-transformers/Portfile
</span><span style='display:block; white-space:pre;background:#e0e0e0;'>@@ -4,7 +4,7 @@ PortSystem          1.0
</span> PortGroup           python 1.0
 
 name                py-sentence-transformers
<span style='display:block; white-space:pre;background:#ffe0e0;'>-version             0.2.6.1
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+version             0.3.7
</span> revision            0
 categories          python textproc
 
<span style='display:block; white-space:pre;background:#e0e0e0;'>@@ -14,32 +14,32 @@ platforms           darwin
</span> 
 description         Sentence Embeddings with BERT & XLNet
 
<span style='display:block; white-space:pre;background:#ffe0e0;'>-long_description    BERT/XLNet produces out-of-the-box rather bad\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    sentence embeddings. This repository fine-tunes\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    BERT/XLNet with a siamese or triplet network\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    structure to produce semantically meaningful\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    sentence embeddings that can be used in\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    unsupervised scenarios: Semantic textual\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    similarity via cosine-similarity, clustering,\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    semantic search. We provide an increasing number\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    of state-of-the-art pretrained models that can be\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    used to derive sentence embeddings. See Pretrained\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    Models. Details of the implemented approaches can\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    be found in our publication: Sentence-BERT:\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    Sentence Embeddings using Siamese BERT-Networks\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    (published at EMNLP 2019). You can use this code\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    to easily train your own sentence embeddings, that\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    are tuned for your specific task. We provide\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    various dataset readers and you can tune sentence\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    embeddings with different loss function, depending\
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    on the structure of your dataset. For further\
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+long_description    BERT/XLNet produces out-of-the-box rather bad \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    sentence embeddings. This repository fine-tunes \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    BERT/XLNet with a siamese or triplet network \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    structure to produce semantically meaningful \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    sentence embeddings that can be used in \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    unsupervised scenarios: Semantic textual \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    similarity via cosine-similarity, clustering, \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    semantic search. We provide an increasing number \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    of state-of-the-art pretrained models that can be \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    used to derive sentence embeddings. See Pretrained \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    Models. Details of the implemented approaches can \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    be found in our publication: Sentence-BERT: \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    Sentence Embeddings using Siamese BERT-Networks \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    (published at EMNLP 2019). You can use this code \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    to easily train your own sentence embeddings, that \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    are tuned for your specific task. We provide \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    various dataset readers and you can tune sentence \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    embeddings with different loss function, depending \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    on the structure of your dataset. For further \
</span>                     details, see Train your own Sentence Embeddings.
 
 homepage            https://github.com/UKPLab/sentence-transformers
 
<span style='display:block; white-space:pre;background:#ffe0e0;'>-checksums           rmd160  ca6098dead338864a765f265341759e58720d87a \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    sha256  68250e1e272ad7013c879a633deca710bbaf7b8cec4080095e88904b93eed128 \
</span><span style='display:block; white-space:pre;background:#ffe0e0;'>-                    size    55609
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+checksums           rmd160  9b370dfe44747f9a7f637ecd158ee12b13464320 \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    sha256  02ac0526af66d40c0d73c5c72169a800227654c61d27a4ba938fdfec46d7197b \
</span><span style='display:block; white-space:pre;background:#e0ffe0;'>+                    size    59599
</span> 
 python.versions     37 38
 
</pre>
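<p>The <code>long_description</code> in the diff above mentions using the resulting sentence embeddings for "semantic textual similarity via cosine-similarity". As a minimal sketch of that scoring step only — with made-up toy vectors standing in for real model output, and without showing the sentence-transformers API itself:</p>

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "sentence embeddings" (illustrative values only;
# a real model such as one from py-sentence-transformers would produce
# vectors with hundreds of dimensions).
emb_query = [0.2, 0.7, 0.1, 0.5]
emb_doc_a = [0.25, 0.6, 0.0, 0.55]   # pointing roughly the same way as the query
emb_doc_b = [-0.6, 0.1, 0.9, -0.2]   # pointing a very different way

print(cosine_similarity(emb_query, emb_doc_a))  # close to 1.0
print(cosine_similarity(emb_query, emb_doc_b))  # much lower
```

<p>Ranking candidate sentences by this score against a query embedding is the basis of the semantic-search and clustering use cases the description lists.</p>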