
US Patent 10565318 Neural machine translation with latent tree attention

Patent 10565318 was granted and assigned to Salesforce.com, Inc. on February 18, 2020 by the United States Patent and Trademark Office.

Is a
Patent

Patent attributes

Patent Applicant
Salesforce.com, Inc.
Current Assignee
Salesforce.com, Inc.
Patent Jurisdiction
United States Patent and Trademark Office
Patent Number
10565318
Date of Patent
February 18, 2020
Patent Application Number
15901722
Date Filed
February 21, 2018
Patent Citations

US Patent 10346721 Training a neural network using augmented training datasets
US Patent 10282663 Three-dimensional (3D) convolution with 3D batch normalization

Patent Citations Received

US Patent 12086539 System and method for natural language processing using neural network with cross-task training
US Patent 11487999 Spatial-temporal reasoning through pretrained language models for video-grounded dialogues
US Patent 11501076 Multitask learning as question answering
US Patent 11934952 Systems and methods for natural language processing using joint energy-based models
US Patent 11948665 Systems and methods for language modeling of protein engineering
US Patent 11487939 Systems and methods for unsupervised autoregressive text compression
US Patent 11537801 Structured text translation
US Patent 10776581 Multitask learning as question answering
...
Patent Primary Examiner
Thomas H Maung
Patent abstract

We introduce an attentional neural machine translation model for the task of machine translation that accomplishes the longstanding goal of natural language processing to take advantage of the hierarchical structure of language without a priori annotation. The model comprises a recurrent neural network grammar (RNNG) encoder with a novel attentional RNNG decoder and applies policy gradient reinforcement learning to induce unsupervised tree structures on both the source sequence and target sequence. When trained on character-level datasets with no explicit segmentation or parse annotation, the model learns a plausible segmentation and shallow parse, obtaining performance close to an attentional baseline.
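
The abstract describes two components: an RNNG encoder paired with an attentional RNNG decoder, and policy-gradient reinforcement learning that induces tree structure over unannotated character sequences. The following is a minimal, hypothetical PyTorch sketch of those two ideas only, not the patented implementation: a plain GRU encoder/decoder stands in for the RNNG stack machinery, and a REINFORCE term over sampled per-position SHIFT/REDUCE actions stands in for unsupervised structure induction. All class, variable, and hyperparameter names here are invented for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentTreeTranslator(nn.Module):
    """Toy sketch: attention over encoder states gated by sampled parse actions."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRUCell(hidden, hidden)
        self.parser = nn.Linear(hidden, 2)           # scores SHIFT (0) vs REDUCE (1)
        self.out = nn.Linear(2 * hidden, vocab_size)

    def forward(self, src, tgt):
        enc, h = self.encoder(self.embed(src))        # (B, S, H)

        # Sample structural actions; their log-probabilities feed the REINFORCE term.
        dist = torch.distributions.Categorical(logits=self.parser(enc))
        actions = dist.sample()                        # (B, S)
        action_logp = dist.log_prob(actions).sum(dim=1)

        # Attention is restricted to positions the policy chose to expose (SHIFT),
        # a crude stand-in for attending over induced constituents.
        mask = (actions == 0)

        state = h[-1]
        logps = []
        for t in range(tgt.size(1) - 1):
            state = self.decoder(self.embed(tgt[:, t]), state)
            scores = torch.einsum("bh,bsh->bs", state, enc)
            scores = scores.masked_fill(~mask, -1e9)
            context = torch.einsum("bs,bsh->bh", F.softmax(scores, dim=1), enc)
            logits = self.out(torch.cat([state, context], dim=1))
            logps.append(F.log_softmax(logits, dim=1)
                         .gather(1, tgt[:, t + 1:t + 2]).squeeze(1))
        nll = -torch.stack(logps, dim=1).sum(dim=1)    # per-example translation loss

        # Policy gradient: translation quality (negative loss) rewards the sampled structure.
        reward = -nll.detach()
        policy_loss = -(reward - reward.mean()) * action_logp
        return (nll + policy_loss).mean()

# Toy usage with random character ids.
model = LatentTreeTranslator(vocab_size=50)
src = torch.randint(0, 50, (4, 12))
tgt = torch.randint(0, 50, (4, 10))
loss = model(src, tgt)
loss.backward()

In the patented model the parse actions drive explicit stack operations in both the RNNG encoder and the attentional RNNG decoder; here they only gate decoder attention, which keeps the sketch short while preserving the training signal described in the abstract: the sampled structure is rewarded by how well the translation turns out.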
