TensorFlow

Machine learning software library

TensorFlow
Developer(s): Google Brain Team[1]
Initial release: November 9, 2015
Stable release: 2.18.0 / October 25, 2024
Repository: github.com/tensorflow/tensorflow
Written in: Python, C++, CUDA
Platform: Linux, macOS, Windows, Android, JavaScript[2]
Type: Machine learning library
License: Apache License 2.0
Website: tensorflow.org

TensorFlow is a software library for machine learning and artificial intelligence. It can be used across a range of tasks, but is used mainly for training and inference of neural networks.[3][4] It is one of the most popular deep learning frameworks, alongside others such as PyTorch and PaddlePaddle.[5][6] It is free and open-source software released under the Apache License 2.0.

It was developed by the Google Brain team for Google's internal use in research and production.[7][8][9] The initial version was released under the Apache License 2.0 in 2015.[1][10] Google released an updated version, TensorFlow 2.0, in September 2019.[11]

TensorFlow can be used in a wide variety of programming languages, including Python, JavaScript, C++, and Java,[12] facilitating its use in a range of applications in many sectors.

History


DistBelief


Starting in 2011, Google Brain built DistBelief as a proprietary machine learning system based on deep learning neural networks. Its use grew rapidly across diverse Alphabet companies in both research and commercial applications.[13][14] Google assigned multiple computer scientists, including Jeff Dean, to simplify and refactor the codebase of DistBelief into a faster, more robust application-grade library, which became TensorFlow.[15] In 2009, the team, led by Geoffrey Hinton, had implemented generalized backpropagation and other improvements, which allowed generation of neural networks with substantially higher accuracy, for instance a 25% reduction in errors in speech recognition.[16]

TensorFlow


TensorFlow is Google Brain's second-generation system. Version 1.0.0 was released on February 11, 2017.[17] While the reference implementation runs on single devices, TensorFlow can run on multiple CPUs and GPUs (with optional CUDA and SYCL extensions for general-purpose computing on graphics processing units).[18] TensorFlow is available on 64-bit Linux, macOS, Windows, and mobile computing platforms including Android and iOS.[citation needed]

Its flexible architecture allows for easy deployment of computation across a variety of platforms (CPUs, GPUs, TPUs), and from desktops to clusters of servers to mobile and edge devices.

TensorFlow computations are expressed as stateful dataflow graphs. The name TensorFlow derives from the operations that such neural networks perform on multidimensional data arrays, which are referred to as tensors.[19] During the Google I/O Conference in June 2016, Jeff Dean stated that 1,500 repositories on GitHub mentioned TensorFlow, of which only 5 were from Google.[20]

In March 2018, Google announced TensorFlow.js version 1.0 for machine learning in JavaScript.[21]

In January 2019, Google announced TensorFlow 2.0.[22] It became officially available in September 2019.[11]

In May 2019, Google announced TensorFlow Graphics for deep learning in computer graphics.[23]

Tensor processing unit (TPU)

Main article: Tensor processing unit

In May 2016, Google announced its tensor processing unit (TPU), an application-specific integrated circuit (ASIC, a hardware chip) built specifically for machine learning and tailored for TensorFlow. A TPU is a programmable AI accelerator designed to provide high throughput of low-precision arithmetic (e.g., 8-bit), and oriented toward using or running models rather than training them. Google announced they had been running TPUs inside their data centers for more than a year, and had found them to deliver an order of magnitude better-optimized performance per watt for machine learning.[24]

In May 2017, Google announced the second-generation TPUs, as well as their availability in Google Compute Engine.[25] The second-generation TPUs deliver up to 180 teraflops of performance, and when organized into clusters of 64 TPUs, provide up to 11.5 petaflops.[citation needed]

In May 2018, Google announced the third-generation TPUs delivering up to 420 teraflops of performance and 128 GB high bandwidth memory (HBM). Cloud TPU v3 Pods offer 100+ petaflops of performance and 32 TB HBM.[26]

In February 2018, Google announced that they were making TPUs available in beta on the Google Cloud Platform.[27]

Edge TPU


In July 2018, the Edge TPU was announced. Edge TPU is Google's purpose-built ASIC chip designed to run TensorFlow Lite machine learning (ML) models on small client computing devices such as smartphones,[28] an approach known as edge computing.

TensorFlow Lite


In May 2017, Google announced a software stack specifically for mobile development, TensorFlow Lite.[29] In January 2019, the TensorFlow team released a developer preview of the mobile GPU inference engine with OpenGL ES 3.1 Compute Shaders on Android devices and Metal Compute Shaders on iOS devices.[30] In May 2019, Google announced that their TensorFlow Lite Micro (also known as TensorFlow Lite for Microcontrollers) and ARM's uTensor would be merging.[31]

TensorFlow 2.0


As TensorFlow's market share among research papers was declining to the advantage of PyTorch,[32] the TensorFlow team announced the release of a new major version of the library in September 2019. TensorFlow 2.0 introduced many changes, the most significant being TensorFlow Eager, which changed the automatic differentiation scheme from the static computational graph to the "define-by-run" scheme originally made popular by Chainer and later PyTorch.[32] Other major changes included the removal of old libraries, cross-compatibility between models trained on different versions of TensorFlow, and significant improvements to performance on GPUs.[33]

Features


AutoDifferentiation


AutoDifferentiation is the process of automatically calculating the gradient vector of a model with respect to each of its parameters. With this feature, TensorFlow can automatically compute the gradients for the parameters in a model, which is useful for algorithms such as backpropagation that require gradients to optimize performance.[34] To do so, the framework must keep track of the order of operations performed on the input tensors in a model, and then compute the gradients with respect to the appropriate parameters.[34]
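A minimal sketch of this feature using the tf.GradientTape API; the toy function and values below are illustrative, not drawn from the cited sources:

    import tensorflow as tf

    x = tf.Variable(3.0)
    with tf.GradientTape() as tape:
        y = x ** 2 + 2.0 * x      # y = x^2 + 2x; the tape records each operation on x
    dy_dx = tape.gradient(y, x)   # dy/dx = 2x + 2, evaluated at x = 3.0
    print(dy_dx.numpy())          # 8.0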

Eager execution


TensorFlow includes an "eager execution" mode, in which operations are evaluated immediately rather than being added to a computational graph that is executed later.[35] Code executed eagerly can be examined step by step through a debugger, since values are computed at each line of code rather than later in a computational graph.[35] This execution paradigm is considered easier to debug because of its step-by-step transparency.[35]
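As an illustrative sketch (the values are arbitrary), eager execution returns concrete results immediately, so intermediate tensors can be printed or inspected in a debugger:

    import tensorflow as tf

    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.matmul(a, a)       # evaluated right away, no graph or session needed
    print(b.numpy())          # [[ 7. 10.] [15. 22.]]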

Distribute


In both eager and graph execution, TensorFlow provides an API for distributing computation across multiple devices with various distribution strategies.[36] This distributed computing can often speed up the training and evaluation of TensorFlow models and is a common practice in the field of AI.[36][37]
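A brief sketch of one such strategy, tf.distribute.MirroredStrategy, which replicates a model across the GPUs visible to the process (falling back to the CPU if none are available); the toy model is invented for illustration:

    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()
    print("Number of replicas:", strategy.num_replicas_in_sync)

    # Variables created inside the scope are mirrored on every replica,
    # and gradients are aggregated across replicas during training.
    with strategy.scope():
        model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
        model.compile(optimizer="sgd", loss="mse")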

Losses


To train and assess models, TensorFlow provides a set of loss functions (also known as cost functions).[38] Some popular examples include mean squared error (MSE) and binary cross-entropy (BCE).[38]
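A short sketch of both losses through the Keras API; the labels and predictions are made-up values:

    import tensorflow as tf

    y_true = tf.constant([0.0, 1.0, 1.0])
    y_pred = tf.constant([0.1, 0.8, 0.6])

    mse = tf.keras.losses.MeanSquaredError()
    bce = tf.keras.losses.BinaryCrossentropy()
    print(mse(y_true, y_pred).numpy())   # mean of the squared differences
    print(bce(y_true, y_pred).numpy())   # average binary cross-entropy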

Metrics


To assess the performance of machine learning models, TensorFlow gives API access to commonly used metrics. Examples include various accuracy metrics (binary, categorical, sparse categorical) along with other metrics such as precision, recall, and intersection-over-union (IoU).[39]
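For example (a hedged sketch with invented labels and scores), a binary accuracy metric accumulates results across calls to update_state:

    import tensorflow as tf

    accuracy = tf.keras.metrics.BinaryAccuracy()
    accuracy.update_state([0, 1, 1, 1], [0.2, 0.7, 0.9, 0.4])  # threshold is 0.5 by default
    print(accuracy.result().numpy())                           # 0.75, three of four correct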

TF.nn


tf.nn is a module for executing primitive neural network operations in models.[40] Some of these operations include variations of convolutions (1D, 2D, and 3D; atrous; depthwise), activation functions (softmax, ReLU, GELU, sigmoid, etc.) and their variations, and other operations (max-pooling, bias-add, etc.).[40]
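A minimal sketch of a few of these primitives; the tensors and shapes are chosen only for illustration:

    import tensorflow as tf

    x = tf.constant([[-1.0, 0.0, 2.0]])
    print(tf.nn.relu(x).numpy())      # [[0. 0. 2.]]
    print(tf.nn.softmax(x).numpy())   # row of probabilities summing to 1

    # A 2-D convolution over a single-channel 4x4 input with one 2x2 filter.
    image = tf.random.normal([1, 4, 4, 1])
    kernel = tf.random.normal([2, 2, 1, 1])
    feature_map = tf.nn.conv2d(image, kernel, strides=1, padding="VALID")
    print(feature_map.shape)          # (1, 3, 3, 1)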

Optimizers


TensorFlow offers a set of optimizers for training neural networks, including Adam, Adagrad, and stochastic gradient descent (SGD).[41] When training a model, different optimizers offer different modes of parameter tuning, often affecting a model's convergence and performance.[42]
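A brief sketch of selecting an optimizer through the Keras API; the toy model and learning rate are illustrative:

    import tensorflow as tf

    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

    # Swapping in tf.keras.optimizers.SGD(...) or Adagrad(...) changes how
    # gradients are applied to the weights, which can affect convergence.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.01), loss="mse")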

Usage and extensions


TensorFlow


TensorFlow serves as a core platform and library for machine learning. TensorFlow's APIs use Keras to allow users to make their own machine-learning models.[33][43] In addition to building and training models, TensorFlow can also help load the data used to train a model and deploy it using TensorFlow Serving.[44]
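A condensed sketch of that workflow, assuming a small invented dataset and model; the export step writes the SavedModel format that TensorFlow Serving loads:

    import numpy as np
    import tensorflow as tf

    # Invented toy data: 100 samples with 8 features and a binary label.
    x = np.random.rand(100, 8).astype("float32")
    y = np.random.randint(0, 2, size=(100, 1)).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=5, batch_size=32, verbose=0)

    tf.saved_model.save(model, "export/my_model")   # directory consumable by TensorFlow Serving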

TensorFlow provides a stable Python Application Programming Interface (API),[45] as well as APIs without backwards compatibility guarantees for JavaScript,[46] C++,[47] and Java.[48][12] Third-party language binding packages are also available for C#,[49][50] Haskell,[51] Julia,[52] MATLAB,[53] Object Pascal,[54] R,[55] Scala,[56] Rust,[57] OCaml,[58] and Crystal.[59] Bindings that are now archived and unsupported include Go[60] and Swift.[61]

TensorFlow.js


TensorFlow also has a library for machine learning in JavaScript. Using the provided JavaScript APIs, TensorFlow.js allows users to use either TensorFlow.js models or models converted from TensorFlow or TFLite, retrain them, and run them on the web.[44][62]

TFLite


TensorFlow Lite has APIs for mobile apps or embedded devices to generate and deploy TensorFlow models.[63] These models are compressed and optimized in order to be more efficient and have higher performance on smaller-capacity devices.[64]

TensorFlow Lite uses FlatBuffers as the data serialization format for network models, eschewing the Protocol Buffers format used by standard TensorFlow models.[64]
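A short sketch of the conversion path using tf.lite.TFLiteConverter; the untrained toy model and output file name are placeholders:

    import tensorflow as tf

    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

    # The converter emits a FlatBuffer rather than a Protocol Buffers SavedModel.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]   # optional size/latency optimizations
    tflite_model = converter.convert()

    with open("model.tflite", "wb") as f:
        f.write(tflite_model)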

TFX


TensorFlow Extended (TFX) provides numerous components to perform all the operations needed for end-to-end production.[65] Components include loading, validating, and transforming data; tuning, training, and evaluating the machine learning model; and pushing the model itself into production.[44][65]

Integrations


NumPy


NumPy is one of the most popular Python data libraries, and TensorFlow offers integration and compatibility with its data structures.[66] NumPy ndarrays, the library's native datatype, are automatically converted to TensorFlow Tensors in TF operations; the converse is also true.[66] This allows the two libraries to work in unison without requiring the user to write explicit data conversions. Moreover, the integration extends to memory optimization by having TF Tensors share the underlying memory representations of NumPy ndarrays whenever possible.[66]
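As an illustrative sketch with arbitrary values:

    import numpy as np
    import tensorflow as tf

    a = np.array([[1.0, 2.0], [3.0, 4.0]])
    t = tf.square(a)          # the ndarray is converted to a tf.Tensor automatically
    back = np.add(t, 1.0)     # NumPy operations accept Tensors and return ndarrays
    print(type(t).__name__, type(back).__name__)
    print(t.numpy())          # explicit conversion back to an ndarray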

Extensions


TensorFlow also offers a variety of libraries and extensions to advance and extend the models and methods used.[67] For example, TensorFlow Recommenders and TensorFlow Graphics are libraries for their respective functionalities.[68] Other add-ons, libraries, and frameworks include TensorFlow Model Optimization, TensorFlow Probability, TensorFlow Quantum, and TensorFlow Decision Forests.[67][68]

Google Colab


Google also released Colaboratory, a TensorFlow Jupyter notebook environment that does not require any setup.[69] It runs on Google Cloud and allows users free access to GPUs and the ability to store and share notebooks on Google Drive.[70]

Google JAX

Main article: Google JAX

Google JAX is a machine learning framework for transforming numerical functions.[71][72][73] It is described as bringing together a modified version of autograd (automatic obtaining of the gradient function through differentiation of a function) and TensorFlow's XLA (Accelerated Linear Algebra). It is designed to follow the structure and workflow of NumPy as closely as possible and works with TensorFlow as well as other frameworks such as PyTorch. The primary functions of JAX, illustrated in the sketch after this list, are:[71]

  1. grad: automatic differentiation
  2. jit: compilation
  3. vmap: auto-vectorization
  4. pmap: SPMD programming
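A brief sketch of the first three transformations on an invented toy function (pmap is omitted here because it needs multiple devices):

    import jax.numpy as jnp
    from jax import grad, jit, vmap

    f = lambda x: x ** 2 + 3.0 * x          # toy scalar function

    df = grad(f)                            # automatic differentiation
    print(df(2.0))                          # 2*2 + 3 = 7.0

    fast_f = jit(f)                         # compiled with XLA on first call
    print(fast_f(jnp.arange(4.0)))          # [ 0.  4. 10. 18.]

    print(vmap(df)(jnp.arange(4.0)))        # gradients over a batch: [3. 5. 7. 9.]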

Applications


Medical


GE Healthcare used TensorFlow to increase the speed and accuracy of MRIs in identifying specific body parts.[74] Google used TensorFlow to create DermAssist, a free mobile application that allows users to take pictures of their skin and identify potential health complications.[75] Sinovation Ventures used TensorFlow to identify and classify eye diseases from optical coherence tomography (OCT) scans.[75]

Social media


Twitter implemented TensorFlow to rank tweets by importance for a given user, and changed its platform to show tweets in order of this ranking.[76] Previously, tweets were simply shown in reverse chronological order.[76] The photo-sharing app VSCO used TensorFlow to help suggest custom filters for photos.[75]

Search Engine


Google officially released RankBrain on October 26, 2015, backed by TensorFlow.[77]

Education


InSpace, a virtual learning platform, used TensorFlow to filter out toxic chat messages in classrooms.[78] Liulishuo, an online English learning platform, utilized TensorFlow to create an adaptive curriculum for each student.[79] TensorFlow was used to accurately assess a student's current abilities, and also helped decide the best future content to show based on those capabilities.[79]

Retail


The e-commerce platform Carousell used TensorFlow to provide personalized recommendations for customers.[75] The cosmetics company ModiFace used TensorFlow to create an augmented reality experience for customers to test various shades of make-up on their face.[80]

[Image: 2016 comparison of an original photo (left) and the same photo with TensorFlow neural style applied (right)]

Research


TensorFlow is the foundation for the automated image-captioning software DeepDream.[81]


References

  1. ^ab"Credits".TensorFlow.org.Archived from the original on November 17, 2015. RetrievedNovember 10, 2015.
  2. ^"TensorFlow.js".Archived from the original on May 6, 2018. RetrievedJune 28, 2018.
  3. ^Abadi, Martín; Barham, Paul; Chen, Jianmin; Chen, Zhifeng; Davis, Andy; Dean, Jeffrey; Devin, Matthieu; Ghemawat, Sanjay; Irving, Geoffrey; Isard, Michael; Kudlur, Manjunath; Levenberg, Josh; Monga, Rajat; Moore, Sherry; Murray, Derek G.; Steiner, Benoit; Tucker, Paul; Vasudevan, Vijay; Warden, Pete; Wicke, Martin; Yu, Yuan; Zheng, Xiaoqiang (2016).TensorFlow: A System for Large-Scale Machine Learning(PDF). Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI ’16).arXiv:1605.08695.Archived(PDF) from the original on December 12, 2020. RetrievedOctober 26, 2020.
  4. ^TensorFlow: Open source machine learning. Google. 2015.Archived from the original on November 11, 2021. "It is machine learning software being used for various kinds of perceptual and language understanding tasks" – Jeffrey Dean, minute 0:47 / 2:17 from YouTube clip
  5. ^"Top 30 Open Source Projects".Open Source Project Velocity by CNCF. RetrievedOctober 12, 2023.
  6. ^"Welcome to the PaddlePaddle GitHub".PaddlePaddle Official Github Repo. RetrievedOctober 28, 2024.
  7. ^Video clip by Google about TensorFlow 2015 at minute 0:15/2:17
  8. ^Video clip by Google about TensorFlow 2015 at minute 0:26/2:17
  9. ^Dean et al 2015, p. 2
  10. ^Metz, Cade (November 9, 2015)."Google Just Open Sourced TensorFlow, Its Artificial Intelligence Engine".Wired.Archived from the original on November 9, 2015. RetrievedNovember 10, 2015.
  11. ^abTensorFlow (September 30, 2019)."TensorFlow 2.0 is now available!".Medium.Archived from the original on October 7, 2019. RetrievedNovember 24, 2019.
  12. ^ab"API Documentation".Archived from the original on November 16, 2015. RetrievedJune 27, 2018.,
  13. ^Dean, Jeff; Monga, Rajat; et al. (November 9, 2015)."TensorFlow: Large-scale machine learning on heterogeneous systems"(PDF).TensorFlow.org. Google Research.Archived(PDF) from the original on November 20, 2015. RetrievedNovember 10, 2015.
  14. ^Perez, Sarah (November 9, 2015)."Google Open-Sources The Machine Learning Tech Behind Google Photos Search, Smart Reply And More".TechCrunch.Archived from the original on November 9, 2015. RetrievedNovember 11, 2015.
  15. ^Oremus, Will (November 9, 2015)."What Is TensorFlow, and Why Is Google So Excited About It?".Slate.Archived from the original on November 10, 2015. RetrievedNovember 11, 2015.
  16. ^Ward-Bailey, Jeff (November 25, 2015)."Google chairman: We're making 'real progress' on artificial intelligence".CSMonitor.Archived from the original on September 16, 2015. RetrievedNovember 25, 2015.
  17. ^TensorFlow Developers (2022)."Tensorflow Release 1.0.0".GitHub.doi:10.5281/zenodo.4724125.Archived from the original on February 27, 2021. RetrievedJuly 24, 2017.
  18. ^Metz, Cade (November 10, 2015)."TensorFlow, Google's Open Source AI, Points to a Fast-Changing Hardware World".Wired.Archived from the original on November 11, 2015. RetrievedNovember 11, 2015.
  19. ^"Introduction to tensors". tensorflow.org.Archived from the original on May 26, 2024. RetrievedMarch 3, 2024.
  20. ^Machine Learning: Google I/O 2016 Minute 07:30/44:44.Archived December 21, 2016, at theWayback Machine. Retrieved 5 June 2016.
  21. ^TensorFlow (March 30, 2018)."Introducing TensorFlow.js: Machine Learning in Javascript".Medium.Archived from the original on March 30, 2018. RetrievedMay 24, 2019.
  22. ^TensorFlow (January 14, 2019)."What's coming in TensorFlow 2.0".Medium.Archived from the original on January 14, 2019. RetrievedMay 24, 2019.
  23. ^TensorFlow (May 9, 2019)."Introducing TensorFlow Graphics: Computer Graphics Meets Deep Learning".Medium.Archived from the original on May 9, 2019. RetrievedMay 24, 2019.
  24. ^Jouppi, Norm."Google supercharges machine learning tasks with TPU custom chip".Google Cloud Platform Blog.Archived from the original on May 18, 2016. RetrievedMay 19, 2016.
  25. ^"Build and train machine learning models on our new Google Cloud TPUs".Google. May 17, 2017.Archived from the original on May 17, 2017. RetrievedMay 18, 2017.
  26. ^"Cloud TPU".Google Cloud.Archived from the original on May 17, 2017. RetrievedMay 24, 2019.
  27. ^"Cloud TPU machine learning accelerators now available in beta".Google Cloud Platform Blog.Archived from the original on February 12, 2018. RetrievedFebruary 12, 2018.
  28. ^Kundu, Kishalaya (July 26, 2018)."Google Announces Edge TPU, Cloud IoT Edge at Cloud Next 2018".Beebom.Archived from the original on May 26, 2024. RetrievedFebruary 2, 2019.
  29. ^"Google's new machine learning framework is going to put more AI on your phone". May 17, 2017.Archived from the original on May 17, 2017. RetrievedMay 19, 2017.
  30. ^TensorFlow (January 16, 2019)."TensorFlow Lite Now Faster with Mobile GPUs (Developer Preview)".Medium.Archived from the original on January 16, 2019. RetrievedMay 24, 2019.
  31. ^"uTensor and Tensor Flow Announcement | Mbed".os.mbed.com.Archived from the original on May 9, 2019. RetrievedMay 24, 2019.
  32. ^abHe, Horace (October 10, 2019)."The State of Machine Learning Frameworks in 2019". The Gradient.Archived from the original on October 10, 2019. RetrievedMay 22, 2020.
  33. ^abCiaramella, Alberto; Ciaramella, Marco (July 2024).Introduction to Artificial Intelligence: from data analysis to generative AI. Intellisemantic Editions.ISBN 9788894787603.
  34. ^ab"Introduction to gradients and automatic differentiation".TensorFlow.Archived from the original on October 28, 2021. RetrievedNovember 4, 2021.
  35. ^abc"Eager execution | TensorFlow Core".TensorFlow.Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  36. ^ab"Module: tf.distribute | TensorFlow Core v2.6.1".TensorFlow.Archived from the original on May 26, 2024. RetrievedNovember 4, 2021.
  37. ^Sigeru., Omatu (2014).Distributed Computing and Artificial Intelligence, 11th International Conference. Springer International Publishing.ISBN 978-3-319-07593-8.OCLC 980886715.Archived from the original on May 26, 2024. RetrievedNovember 4, 2021.
  38. ^ab"Module: tf.losses | TensorFlow Core v2.6.1".TensorFlow.Archived from the original on October 27, 2021. RetrievedNovember 4, 2021.
  39. ^"Module: tf.metrics | TensorFlow Core v2.6.1".TensorFlow.Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  40. ^ab"Module: tf.nn | TensorFlow Core v2.7.0".TensorFlow.Archived from the original on May 26, 2024. RetrievedNovember 6, 2021.
  41. ^"Module: tf.optimizers | TensorFlow Core v2.7.0".TensorFlow.Archived from the original on October 30, 2021. RetrievedNovember 6, 2021.
  42. ^Dogo, E. M.; Afolabi, O. J.; Nwulu, N. I.; Twala, B.; Aigbavboa, C. O. (December 2018)."A Comparative Analysis of Gradient Descent-Based Optimization Algorithms on Convolutional Neural Networks".2018 International Conference on Computational Techniques, Electronics and Mechanical Systems (CTEMS). pp. 92–99.doi:10.1109/CTEMS.2018.8769211.ISBN 978-1-5386-7709-4.S2CID 198931032.Archived from the original on May 26, 2024. RetrievedJuly 25, 2023.
  43. ^"TensorFlow Core | Machine Learning for Beginners and Experts".TensorFlow.Archived from the original on January 20, 2023. RetrievedNovember 4, 2021.
  44. ^abc"Introduction to TensorFlow".TensorFlow.Archived from the original on January 20, 2023. RetrievedOctober 28, 2021.
  45. ^"All symbols in TensorFlow 2 | TensorFlow Core v2.7.0".TensorFlow.Archived from the original on November 6, 2021. RetrievedNovember 6, 2021.
  46. ^"TensorFlow.js".js.tensorflow.org.Archived from the original on May 26, 2024. RetrievedNovember 6, 2021.
  47. ^"TensorFlow C++ API Reference | TensorFlow Core v2.7.0".TensorFlow.Archived from the original on January 20, 2023. RetrievedNovember 6, 2021.
  48. ^"org.tensorflow | Java".TensorFlow.Archived from the original on November 6, 2021. RetrievedNovember 6, 2021.
  49. ^Icaza, Miguel de (February 17, 2018)."TensorFlowSharp: TensorFlow API for .NET languages".GitHub.Archived from the original on July 24, 2017. RetrievedFebruary 18, 2018.
  50. ^Chen, Haiping (December 11, 2018)."TensorFlow.NET: .NET Standard bindings for TensorFlow".GitHub.Archived from the original on July 12, 2019. RetrievedDecember 11, 2018.
  51. ^"haskell: Haskell bindings for TensorFlow". tensorflow. February 17, 2018.Archived from the original on July 24, 2017. RetrievedFebruary 18, 2018.
  52. ^Malmaud, Jon (August 12, 2019)."A Julia wrapper for TensorFlow".GitHub.Archived from the original on July 24, 2017. RetrievedAugust 14, 2019.operations like sin, * (matrix multiplication), .* (element-wise multiplication), etc [..]. Compare to Python, which requires learning specialized namespaced functions like tf.matmul.
  53. ^"A MATLAB wrapper for TensorFlow Core".GitHub. November 3, 2019.Archived from the original on September 14, 2020. RetrievedFebruary 13, 2020.
  54. ^"Use TensorFlow from Pascal (FreePascal, Lazarus, etc.)".GitHub. January 19, 2023.Archived from the original on January 20, 2023. RetrievedJanuary 20, 2023.
  55. ^"tensorflow: TensorFlow for R". RStudio. February 17, 2018.Archived from the original on January 4, 2017. RetrievedFebruary 18, 2018.
  56. ^Platanios, Anthony (February 17, 2018)."tensorflow_scala: TensorFlow API for the Scala Programming Language".GitHub.Archived from the original on February 18, 2019. RetrievedFebruary 18, 2018.
  57. ^"rust: Rust language bindings for TensorFlow". tensorflow. February 17, 2018.Archived from the original on July 24, 2017. RetrievedFebruary 18, 2018.
  58. ^Mazare, Laurent (February 16, 2018)."tensorflow-ocaml: OCaml bindings for TensorFlow".GitHub.Archived from the original on June 11, 2018. RetrievedFebruary 18, 2018.
  59. ^"fazibear/tensorflow.cr".GitHub.Archived from the original on June 27, 2018. RetrievedOctober 10, 2018.
  60. ^"tensorflow package - github.com/tensorflow/tensorflow/tensorflow/go - pkg.go.dev".pkg.go.dev.Archived from the original on November 6, 2021. RetrievedNovember 6, 2021.
  61. ^"Swift for TensorFlow (In Archive Mode)".TensorFlow.Archived from the original on November 6, 2021. RetrievedNovember 6, 2021.
  62. ^"TensorFlow.js | Machine Learning for JavaScript Developers".TensorFlow.Archived from the original on November 4, 2021. RetrievedOctober 28, 2021.
  63. ^"TensorFlow Lite | ML for Mobile and Edge Devices".TensorFlow.Archived from the original on November 4, 2021. RetrievedNovember 1, 2021.
  64. ^ab"TensorFlow Lite".TensorFlow.Archived from the original on November 2, 2021. RetrievedNovember 1, 2021.
  65. ^ab"TensorFlow Extended (TFX) | ML Production Pipelines".TensorFlow.Archived from the original on November 4, 2021. RetrievedNovember 2, 2021.
  66. ^abc"Customization basics: tensors and operations | TensorFlow Core".TensorFlow.Archived from the original on November 6, 2021. RetrievedNovember 6, 2021.
  67. ^ab"Guide | TensorFlow Core".TensorFlow.Archived from the original on July 17, 2019. RetrievedNovember 4, 2021.
  68. ^ab"Libraries & extensions".TensorFlow.Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  69. ^"Colaboratory – Google".research.google.com.Archived from the original on October 24, 2017. RetrievedNovember 10, 2018.
  70. ^"Google Colaboratory".colab.research.google.com.Archived from the original on February 3, 2021. RetrievedNovember 6, 2021.
  71. ^abBradbury, James; Frostig, Roy; Hawkins, Peter; Johnson, Matthew James; Leary, Chris; MacLaurin, Dougal; Necula, George; Paszke, Adam; Vanderplas, Jake; Wanderman-Milne, Skye; Zhang, Qiao (June 18, 2022),"JAX: Autograd and XLA",Astrophysics Source Code Library, Google,Bibcode:2021ascl.soft11002B, archived fromthe original on June 18, 2022, retrievedJune 18, 2022
  72. ^"Using JAX to accelerate our research".www.deepmind.com.Archived from the original on June 18, 2022. RetrievedJune 18, 2022.
  73. ^"Why is Google's JAX so popular?".Analytics India Magazine. April 25, 2022.Archived from the original on June 18, 2022. RetrievedJune 18, 2022.
  74. ^"Intelligent Scanning Using Deep Learning for MRI".Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  75. ^abcd"Case Studies and Mentions".TensorFlow.Archived from the original on October 26, 2021. RetrievedNovember 4, 2021.
  76. ^ab"Ranking Tweets with TensorFlow".Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  77. ^Davies, Dave (September 2, 2020)."A Complete Guide to the Google RankBrain Algorithm".Search Engine Journal.Archived from the original on November 6, 2021. RetrievedOctober 15, 2024.
  78. ^"InSpace: A new video conferencing platform that uses TensorFlow.js for toxicity filters in chat".Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  79. ^abXulin."流利说基于 TensorFlow 的自适应系统实践".Weixin Official Accounts Platform.Archived from the original on November 6, 2021. RetrievedNovember 4, 2021.
  80. ^"How Modiface utilized TensorFlow.js in production for AR makeup try on in the browser".Archived from the original on November 4, 2021. RetrievedNovember 4, 2021.
  81. ^Byrne, Michael (November 11, 2015)."Google Offers Up Its Entire Machine Learning Library as Open-Source Software".Vice.Archived from the original on January 25, 2021. RetrievedNovember 11, 2015.
