{"id":817,"date":"2025-07-29T20:00:00","date_gmt":"2025-07-29T16:00:00","guid":{"rendered":"http:\/\/www.actutech.app\/googles-ai-mode-update-adds-even-more-tools-for-students\/"},"modified":"2025-07-29T20:00:00","modified_gmt":"2025-07-29T16:00:00","slug":"googles-ai-mode-update-adds-even-more-tools-for-students","status":"publish","type":"post","link":"http:\/\/www.actutech.app\/en\/googles-ai-mode-update-adds-even-more-tools-for-students\/","title":{"rendered":"Google\u2019s AI Mode update adds even more tools for students"},"content":{"rendered":"<figure>\n<p><img decoding=\"async\" alt=\"\" data-caption=\"Users will soon be able to upload PDFs to AI Mode, too. | Image: Google\" data-portal-copyright=\"Image: Google\" data-has-syndication-rights=\"1\" src=\"https:\/\/platform.theverge.com\/wp-content\/uploads\/sites\/2\/2025\/07\/File-upload-in-AI-Mode-1-of-2.png?quality=90&amp;strip=all&amp;crop=0,0,100,100\" \/><figcaption>\n\tUsers will soon be able to upload PDFs to AI Mode, too. | Image: Google\t<\/figcaption><\/p><\/figure>\n<p class=\"has-text-align-none\">Google is bringing a bunch of new features to AI Mode, and is positioning the update as a way to help students study for tests or dig deeper into what they\u2019re learning. 
Today, the company announced that it will now let users upload images to AI Mode on desktop, allowing them to ask questions about what they\u2019re seeing, whether it\u2019s a homework math problem or a plant they want to learn more about.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/platform.theverge.com\/wp-content\/uploads\/sites\/2\/2025\/07\/ai-mode-search-live-video.gif?quality=90&amp;strip=all&amp;crop=0,0,100,100\" alt=\"\" title=\"\" data-has-syndication-rights=\"1\" data-caption=\"\" data-portal-copyright=\"GIF: Google\" \/><\/p>\n<p class=\"has-text-align-none\">In May, <a href=\"https:\/\/www.theverge.com\/google-io\/670439\/google-ai-mode-search-io-2025\" target=\"_blank\" rel=\"noopener\">Google built AI Mode into Search<\/a> in the US; the tool searches the web, summarizes its findings, and lets users ask follow-up questions in a back-and-forth conversation. Google <a href=\"https:\/\/www.theverge.com\/news\/644363\/google-search-ai-mode-multimodal-lens-image-recognition\" target=\"_blank\" rel=\"noopener\">launched the ability<\/a> to upload images to AI Mode in April, while the feature was still in testing; adding it to desktop could make it easier for students to get help on projects or assignments that they\u2019re working on.<\/p>\n<p class=\"has-text-align-none\">Other changes coming soon include <a href=\"https:\/\/www.theverge.com\/news\/670597\/google-search-live-ai-mode-gemini-ios\" target=\"_blank\" rel=\"noopener\">a test of real-time camera sharing<\/a> in AI Mode, building upon the <a href=\"https:\/\/www.theverge.com\/news\/689212\/google-search-live-ai-mode-test\" target=\"_blank\" rel=\"noopener\">Search Live features it already has<\/a>. Now, instead of just having a spoken conversation with AI Mode\u2019s custom version of Gemini, users can point their camera at whatever they have a question about and ask about it aloud. 
This feature is coming to mobile users in the US who have opted into the AI Mode Labs experiment.<\/p>\n<p class=\"has-text-align-none\">In addition, Google is trying to make it <a href=\"https:\/\/www.theverge.com\/2024\/8\/1\/24210670\/google-chrome-lens-ai-search-history\" target=\"_blank\" rel=\"noopener\">easier to access Lens in Chrome<\/a> by displaying a new \u201cAsk Google about this page\u201d option when users click on the address bar in Chrome. When users select this option, the tool will generate an AI Overview of the webpage\u2019s content directly in the browser\u2019s sidebar. Google also plans on letting users ask additional questions about a Lens response by choosing \u201cAI Mode\u201d at the top of Lens results and selecting \u201cDive deeper.\u201d\u00a0<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/platform.theverge.com\/wp-content\/uploads\/sites\/2\/2025\/07\/Lens-in-Chrome-desktop.png?quality=90&amp;strip=all&amp;crop=0,0,100,100\" alt=\"\" title=\"\" data-has-syndication-rights=\"1\" data-caption=\"&lt;em&gt;Chrome will soon start displaying an \u201cAsk Google about this page\u201d option in the address bar.&lt;\/em&gt; | Image: Google\" data-portal-copyright=\"Image: Google\" \/><\/p>\n<p class=\"has-text-align-none\">Further out, Google will start letting users upload PDFs to AI Mode and pull in files from their Google Drive. The company is testing Canvas in AI Mode on desktop as well.<\/p>\n<p class=\"has-text-align-none\"><a href=\"https:\/\/www.theverge.com\/google\/631726\/google-gemini-canvas-audio-overview-notebooklm-coding\" target=\"_blank\" rel=\"noopener\">Google first launched Canvas in Gemini<\/a> in March, serving as a workspace where users can ask Gemini to help refine their writing, build apps, create games, generate interactive quizzes, and more. 
The company\u2019s announcement says bringing Canvas to AI Mode can help students create study guides by pulling together information in the Canvas sidebar, allowing them to tweak its output in real time with additional questions.<\/p>\n<p><img decoding=\"async\" src=\"https:\/\/platform.theverge.com\/wp-content\/uploads\/sites\/2\/2025\/07\/Canvas-in-AI-Mode.png?quality=90&amp;strip=all&amp;crop=0,16.666666666667,100,66.666666666667\" alt=\"\" title=\"\" data-has-syndication-rights=\"1\" data-caption=\"&lt;em&gt;Canvas can help students build study guides and more.&lt;\/em&gt; | Image: Google\" data-portal-copyright=\"Image: Google\" \/><\/p>\n<p class=\"has-text-align-none\">Canvas in AI Mode will be available in the \u201ccoming weeks\u201d to US desktop users who enable the experiment through Search Labs.<\/p>","protected":false},"excerpt":{"rendered":"<p>Users will soon be able to upload PDFs to AI Mode, too. | Image: Google Google is bringing a bunch [&hellip;]<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{},"categories":[1],"tags":[],"class_list":["post-817","post","type-post","status-publish","format-standard","hentry","category-non-classe"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/posts\/817","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/comments?post=817"}],"version-history":[{"count":0,"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/posts\/817\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/media?parent=817"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/categories?post=817"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.actutech.app\/en\/wp-json\/wp\/v2\/tags?post=817"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}