Local AI and NAS Drives – IS THIS A GOOD THING?

The Pros and Cons of AI Use in Your NAS

In recent years, Artificial Intelligence (AI) and Large Language Models (LLMs) have emerged as transformative technologies across various industries. Their ability to process vast amounts of data, automate complex tasks, and provide intuitive user interactions has made them invaluable in applications ranging from customer service to data analysis. Now, these technologies are making their way into the realm of Network Attached Storage (NAS) devices, promising to revolutionize how users store, manage, and interact with their data.

The integration of AI and LLMs into NAS systems is more than just a buzzword—it represents a shift toward smarter, more efficient data management. From improving search and categorization through AI recognition to enabling natural language commands for administrative tasks, the potential applications are vast. However, these advancements also bring challenges, particularly in terms of security and data privacy. In this article, we’ll explore the current state of AI and LLM use in NAS devices, the benefits of local deployment, the security concerns that need addressing, and the brands leading this exciting transformation. Whether you’re a tech enthusiast, a small business owner, or a large enterprise, the rise of AI-powered NAS systems is a development worth understanding.

The Use of AI Recognition in NAS to Date

AI has steadily evolved in NAS systems, but its use has largely focused on recognition tasks rather than broader assistance or intelligence. In its early stages, AI in NAS was synonymous with recognition technology in photo and video management. This included tagging and categorizing images by identifying objects, people, animals, and scenery. These tools offered a way to organize vast amounts of data efficiently but required manual intervention to capitalize on their functionality. While helpful, these recognition tasks were limited in scope and often felt like minor conveniences rather than transformative innovations. They were more about helping users sift through data than empowering them to interact with it dynamically.
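As a rough illustration of how this kind of recognition-based tagging works, the sketch below runs a stock pre-trained image classifier over a folder of photos and records the top label for each file. This is a minimal example assuming Python with the torchvision library installed; the folder path and the idea of writing tags to a JSON sidecar file are placeholder assumptions, not any vendor's actual implementation.

```python
# Minimal sketch: tag photos in a folder with a pre-trained classifier.
# Assumes torch/torchvision are installed; /volume1/photos is a placeholder path.
import json
from pathlib import Path

import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

weights = ResNet50_Weights.DEFAULT          # stock ImageNet weights
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()           # matching resize/normalise pipeline
labels = weights.meta["categories"]

tags = {}
for photo in Path("/volume1/photos").glob("*.jpg"):
    img = preprocess(Image.open(photo).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        scores = model(img).softmax(dim=1)
    top = scores.argmax(dim=1).item()
    tags[photo.name] = labels[top]          # e.g. "golden retriever", "seashore"

# Write the tags next to the library so a search/index layer can pick them up.
Path("photo_tags.json").write_text(json.dumps(tags, indent=2))
```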

Surveillance was another area where AI found its niche in NAS systems. AI-powered surveillance solutions could identify individuals or objects in live video streams, providing real-time alerts and aiding in security operations. However, this application came with significant resource demands, requiring high-performance CPUs, GPUs, and robust storage solutions to process the data effectively. For example, recognizing someone as “David Trent” and verifying their access rights demanded not only live video analysis but also database integration. While advancements in hardware have made these processes less resource-intensive, they still remain confined to niche use cases. The introduction of large language models (LLMs) in NAS systems is set to change this, offering more versatile, interactive, and user-friendly AI capabilities.
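To make the "recognise David Trent and check his access rights" example a little more concrete, here is a minimal sketch using the open-source face_recognition library against a tiny in-memory "database" of known people. The library calls are real, but the access table and camera-frame source are placeholder assumptions, not how any specific NAS surveillance package is built.

```python
# Minimal sketch: match a face in a captured frame against known people and check access.
# Assumes the open-source face_recognition library; the access table stands in for
# whatever database a real surveillance suite would integrate with.
import face_recognition

# "Database" of enrolled people: one reference photo each, plus an access flag.
known = {
    "David Trent": {
        "encoding": face_recognition.face_encodings(
            face_recognition.load_image_file("david_trent.jpg"))[0],
        "allowed": True,
    },
}

frame = face_recognition.load_image_file("camera_frame.jpg")  # e.g. a snapshot from the stream
for encoding in face_recognition.face_encodings(frame):
    for name, record in known.items():
        if face_recognition.compare_faces([record["encoding"]], encoding)[0]:
            status = "access granted" if record["allowed"] else "ACCESS DENIED - alert"
            print(f"{name}: {status}")
            break
    else:
        print("Unknown person detected - raise alert")
```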


How Are NAS Brands Going to Take Advantage of AI?

Even if we ignore the clear profitability that integrating AI services will bring to NAS manufacturers as a means to promote their products, we can all agree that presenting the clearest, easiest and most jargon-free means for a user to access their data is paramount. To go on the briefest of tangents, consider the cloud industry and the NAS industry over the last two decades – one (cloud) provides convenience, simplicity and a low cost of entry, whereas the other (NAS) provides better long-term value (TCO), capacity and security control. Until the last few years, in an increasingly AI-data-driven marketplace, cloud services have been able to leverage AI more affordably for end users because all computation happens in the cloud (i.e. at the remote data center/farm level). However, the market is changing. Alongside the increased affordability of bare-metal server solutions, there is growing awareness of the security hazards of your data sitting on ‘someone else’s computer’, and a growing ability for AI-powered processes to run partially and/or fully on local bare-metal servers on site.

 

We are already seeing how NAS manufacturers of all types are leveraging AI services. Go back a few years to the emergence of AI-powered photo recognition, improvements in OCR and audio transcription that made data more three-dimensional, and AI-powered identification that can make informed decisions based on the specific person/target identified (e.g. domestic surveillance) – these are all now commonplace and done better on local server storage than in the cloud. However, it is the next few years that excite me the most. Now that businesses and home users alike are sitting on DECADES of data, it has fallen to AI to create the best way to access this data in a timely and efficient manner. Outside of the hyperscale and unified storage tier, most users cannot (and do not) store their TERABYTES of data in the cloud – too costly, and in some cases, simply not possible. So as NAS and hardware manufacturers rise to this challenge with AI-powered tools, they find themselves in a market ‘sweet spot’ where there is both the demand for this solution and the technology to present it affordably.

In short, AI language models are going to improve the accessibility of data, allowing users who are not network storage or IT professionals to get their data to do WHAT they want, WHEN they want, and HOW they want – securely and safely!


The Security Issues of AI/LLM Use in NAS Systems Right Now

Although the security concerns of AI use and AI access are multi-faceted (which is arguably true of any data-managed or accelerated appliance), the main concerns stem from unauthorized access. From theft or ransomware to traditional personal privacy, the bulk of security concerns come down to finding a balance between ease of access, speed of response and security of the process from beginning to end. Encrypted tunnels, whereby data is locked at the start point and only unpacked at the endpoint, have been vital in this process, but they also rely heavily on a powerful yet efficient host/client hardware relationship – yet another benefit of powerful local/bare-metal servers that allow direct user hardware control. Add to this tailored security access, control of bare-metal systems to create air gaps, custom authentication patterns, selective access locations and, probably most important of all, FULL control of that host/client delivery. The real market demand right now in NAS is for archival and/or hot/live data to spend no time on any server that is not your own. Encrypted or not, businesses and home users alike are still exceptionally wary of having their corporate/personal confidential data on “someone else’s computer” – at best because they do not want it used as training data for someone else’s AI model, and at worst because they do not want their privacy fundamentally infringed. A local AI running in and through a bare-metal NAS locks access to a single controlled site, with the added benefit of total control over each part of the data transaction.

Source – https://www.leewayhertz.com/data-security-in-ai-systems/

The integration of AI and LLM services in NAS devices brings numerous opportunities, but it also exposes users to critical security risks. The most prominent concern is the reliance on cloud-based AI platforms like ChatGPT, Google Gemini, and Azure AI. These services require data to be sent to remote servers for processing, which can create vulnerabilities. Organizations handling sensitive or regulated data, such as healthcare providers, law firms, and financial institutions, face significant risks when relying on these external platforms. Even with encrypted transfers and secure API keys, the mere act of sending data offsite can violate privacy regulations and increase exposure to cyber threats.

Another issue is the potential for misuse or unintentional inclusion of user data in external AI training datasets. Even if data isn’t directly accessed by unauthorized parties, it may still contribute to refining and training external AI systems. This lack of transparency creates distrust among users, particularly those who have invested in NAS systems to avoid cloud dependence in the first place. Regulatory environments such as GDPR and HIPAA further complicate the picture, as these frameworks impose stringent requirements on data privacy. For businesses that prioritize confidentiality, these risks underscore the importance of locally deployed AI solutions that keep data within their private networks.


How Have Recent CPU Developments Improved Local AI NAS Use?

AI processes are always going to be constrained by the speed and capability of the system hardware (whether that is the client or the host server), and having access to an enormous database from which to extract information, plus an AI/LLM to interface with it effectively, are just two pieces of the puzzle. In many ways, a lot of this has been possible for well over a decade. Having hardware that is both efficient and powerful enough to do it in the way we need has only recently become practical, thanks to improvements in the power-versus-output trade-offs of modern hardware. The shift in CPU profiling slightly away from traditional cores, threads and clock speeds, and towards accounting for NPU abilities and intelligent CPU/GPU offload, has been quick but vital. Having “the fastest horse in the race” is no longer the be-all and end-all.

 

| Key Feature | CPU (Central Processing Unit) | NPU (Neural Processing Unit) | GPU (Graphics Processing Unit) |
|---|---|---|---|
| Core Functionality | Versatile processor for managing general computing tasks like running software and handling operating system operations. | Purpose-built for AI and machine learning workloads, specializing in neural network processing and inference tasks. | Primarily designed for parallel processing, focusing on rendering graphics and accelerating computation-heavy operations. |
| Core Design | Few cores optimized for linear processing and multitasking capabilities. | Hundreds to thousands of small cores designed for efficiently handling matrix and tensor operations in deep learning applications. | Built with hundreds to thousands of cores tailored for large-scale parallel computations. |
| Performance Strength | Well-suited for a variety of tasks but less effective for operations requiring massive parallelism. | Delivers exceptional performance for AI inference, training, and related tasks, with low latency and high efficiency. | Excels at high-throughput tasks like graphics rendering, video processing, and AI model training. |
| Primary Applications | Best for everyday computing, including spreadsheets, application management, and operating system processes. | Ideal for AI-related tasks, such as natural language processing, voice recognition, and image analysis. | Perfect for high-demand graphical and parallel workloads like gaming, video editing, and scientific simulations. |
| Power Efficiency | Consumes more energy for tasks outside its intended scope but is generally optimized for standard workloads. | Highly energy-efficient for AI operations, ideal for low-power devices, edge applications, and data centers. | Moderately power-efficient for intensive parallel workloads, optimized for tasks like gaming and rendering. |
| Overall Summary | The CPU is the all-purpose processing unit, excelling in versatility but less specialized for parallel-intensive tasks. | The NPU is the AI-focused unit, optimized for high performance in deep learning and neural network computations. | The GPU is the parallel processing powerhouse, designed for rendering and computationally demanding tasks. |

Now that the benefits and utility of AI/LLMs in both home and business are pretty well established, there has been huge development towards processors that seek to find the balancing point between power used and power needed. Intel has been by far the biggest player in this market already. The Core Ultra series and the recently launched Xeon 6 processors pivot a lot of AI learning and execution away from heavier, more power-hungry GPU activity (as well as delegating resource use internally as the need arises). The result is that this newer wave of better-balanced CPU/GPU/NPU processors will be significantly better at handling future AI-managed processes, while reducing power consumption and shrinking the systems that manage hundreds of thousands of these operations. Add to that the fact that localised AI processes require bare-metal systems capable of getting the job done without cloud resources, and Intel is in a great position right now to dominate this AI processor space.
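As a hedged sketch of what this CPU/GPU/NPU delegation can look like in practice on an Intel platform, the snippet below uses OpenVINO to list the accelerators a box exposes and compile the same model for whichever is available, falling back to the CPU. The model path is a placeholder, and whether an "NPU" device actually appears depends on the specific chip and drivers.

```python
# Sketch: pick the best available Intel accelerator (NPU > GPU > CPU) for inference.
# Assumes OpenVINO is installed; "model.xml" is a placeholder for an exported model.
import openvino as ov

core = ov.Core()
print("Devices exposed by this system:", core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU']

preferred = next((d for d in ("NPU", "GPU", "CPU") if d in core.available_devices), "CPU")
compiled = core.compile_model("model.xml", device_name=preferred)
print(f"Model compiled for {preferred}; inference now runs on that device.")
```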


The Benefits of Local AI/LLM Deployment

Deploying AI and LLM services locally on NAS devices addresses many of the security and compliance concerns associated with cloud-dependent solutions. When AI operates entirely within the confines of a NAS, sensitive data never leaves the user’s controlled environment. This eliminates the risk of unauthorized access, data leaks, or inadvertent inclusion in external AI training. Industries like healthcare, finance, and legal services stand to benefit immensely, as they often handle data that is highly sensitive and subject to strict regulatory standards.

In addition to bolstering privacy, local AI deployment also offers substantial performance advantages. Tasks such as querying databases, generating reports, or categorizing large datasets can be processed faster since they don’t rely on an internet connection. For users in remote locations or those with unreliable internet, this capability ensures consistent performance. Furthermore, local deployment allows for highly customized AI models tailored to specific organizational needs, from managing unique workflows to optimizing resource allocation. By keeping AI processing close to the data source, local deployment combines efficiency, security, and adaptability in a way that cloud solutions cannot match.
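As a concrete (though purely illustrative) example of what "AI that never leaves the box" can look like, the snippet below queries a locally hosted model through the Ollama HTTP API running on the NAS itself, so the prompt and the response never leave the LAN. It assumes Ollama is installed and a model such as llama3 has already been pulled; the model name and question are placeholders.

```python
# Sketch: ask a locally hosted LLM a question entirely on the LAN (no cloud round-trip).
# Assumes an Ollama server is running on the NAS at its default port with a model pulled.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",                      # placeholder - any locally pulled model
    "prompt": "Summarise last month's backup logs in three bullet points.",
    "stream": False,
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```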


The Administrative and Usage Benefits of Local AI/LLM Services on NAS Storage

The integration of local AI/LLM services into NAS systems not only enhances security but also revolutionizes the way users interact with their devices. One of the standout features is the ability to use natural language commands for system management. This eliminates the complexity of navigating intricate menus and understanding technical jargon. For instance, instead of manually adjusting user permissions or configuring backups, users can issue simple commands like “Share this folder with the marketing team” or “Backup all files from this month.” The system interprets these instructions and executes them seamlessly, saving time and reducing frustration.

From an administrative perspective, this functionality is a game-changer. IT professionals can automate repetitive tasks such as user management, system monitoring, and data organization, freeing them to focus on strategic initiatives. For smaller businesses or individual users, this democratization of technology reduces the learning curve, making advanced NAS functionalities accessible to non-technical users. Additionally, local AI systems can analyze usage patterns to optimize system performance, flag potential issues before they escalate, and even suggest improvements. Whether for personal use or enterprise deployments, local AI and LLM services make NAS devices more intuitive and effective tools.
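One way such natural-language administration can be wired up is to have a local model translate the request into a small, fixed set of structured actions that the NAS software then validates and executes; anything outside that whitelist is rejected. The sketch below shows the idea with hypothetical share_folder() and run_backup() handlers – the command schema and function names are illustrative assumptions, not any vendor's actual admin API.

```python
# Sketch: map a natural-language request to a whitelisted NAS action.
# parse_with_local_llm() stands in for a local model that returns JSON, and
# share_folder()/run_backup() are hypothetical handlers, not a real NAS API.
import json

ALLOWED_ACTIONS = {"share_folder", "run_backup"}

def parse_with_local_llm(text: str) -> dict:
    """Placeholder: a local LLM prompted to answer ONLY with JSON like
    {"action": "share_folder", "folder": "Q3 Reports", "group": "marketing"}."""
    return json.loads('{"action": "share_folder", "folder": "Q3 Reports", "group": "marketing"}')

def share_folder(folder: str, group: str) -> None:
    print(f"[nas] granting group '{group}' read access to '{folder}'")  # hypothetical handler

def run_backup(scope: str) -> None:
    print(f"[nas] starting backup job for {scope}")                     # hypothetical handler

def handle(command_text: str) -> None:
    intent = parse_with_local_llm(command_text)
    if intent.get("action") not in ALLOWED_ACTIONS:
        raise ValueError("Refusing action outside the whitelist")       # keep the LLM on rails
    if intent["action"] == "share_folder":
        share_folder(intent["folder"], intent["group"])
    elif intent["action"] == "run_backup":
        run_backup(intent.get("scope", "all"))

handle("Share this folder with the marketing team")
```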


Where Are Hard Drives in Local AI NAS Use? Are They Too Slow?

The database that an AI service or infrastructure interfaces with, as well as the core system hardware that supports the LLM/training model, are only going to be as effective as the delivery of the data from point A to point B. Having a super-intelligent and well-trained AI connected to a comparative continent of records and information is largely useless if the delivery of that data is bottlenecked. Until recent years, the answer was thought to be SSDs, with their higher IO and superior simultaneous access abilities. However, their individual shortfall has always been capacity – the sheer weight of NEW data generated at any given moment is truly astronomical, and that is not even factoring in the scale of existing databases.

Therefore, hard drives have once again had to come to the rescue, and although the reduced TCO of HDDs, as well as their phenomenal capacity growth, has been hugely appealing, HDD technology has not been sleeping during the AI boom! Intelligent caching, the benefits of multi-drive read/write operations in large-scale RAID environments, and AI-driven tiered storage alongside strategic SSD deployment – all of this and more has resulted in an AI-driven world that, rather than turning its back on mechanical hard disks, has actually embraced and increased their utility!
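To give a feel for what the intelligent caching and tiered storage side of this looks like at its very simplest, the sketch below promotes recently accessed files from an HDD pool to an SSD cache path based on access time. Real NAS tiering engines are far more sophisticated (and often block-level); both paths here are placeholder assumptions.

```python
# Sketch: promote "hot" files (accessed in the last 24h) from an HDD pool to an SSD cache tier.
# Both paths are placeholders; real tiering is usually block-level and policy-driven.
import shutil
import time
from pathlib import Path

HDD_POOL = Path("/volume1/archive")     # placeholder slow tier
SSD_CACHE = Path("/volume2/ssd_cache")  # placeholder fast tier
HOT_WINDOW = 24 * 3600                  # seconds

now = time.time()
for f in HDD_POOL.rglob("*"):
    if f.is_file() and now - f.stat().st_atime < HOT_WINDOW:
        target = SSD_CACHE / f.relative_to(HDD_POOL)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)         # copy up to the fast tier; originals stay on the HDD
        print(f"promoted {f} -> {target}")
```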


Brands Currently Engaging with Local AI Deployment

The integration of local AI services in NAS systems is no longer a niche feature, with several brands leading the charge in this space. Synology, a long-established player in the NAS market, has developed an AI Admin Console that allows users to integrate third-party LLMs like ChatGPT and Gemini. While this approach relies on external platforms, it offers granular controls to limit data exposure, providing a middle ground for users who want advanced AI features without fully sacrificing security. This hybrid model appeals to users who need both functionality and control.

Zettlabs, a lesser-known yet innovative company, has embraced fully local AI solutions. During a demonstration at IFA Berlin, Zettlabs showcased a crowdfunding project featuring offline AI capabilities. The system processed complex queries using only local datasets, such as querying an eBook database or analyzing medical records without requiring internet access. This approach highlights the potential for offline AI in specialized industries like healthcare and education. UGREEN, a brand known for its DXP systems, is also exploring local AI deployment. Their systems focus on efficient offline processing and interactive management, providing another compelling option for users seeking privacy-first AI solutions. Together, these brands are shaping the future of AI-powered NAS devices by prioritizing user privacy and functionality.
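The kind of offline querying described above (e.g. searching an eBook or document library without internet access) is typically built on local embeddings: documents are converted to vectors once, and questions are matched against them entirely on the box. Below is a minimal sketch of that idea, assuming the sentence-transformers package with an already-downloaded model; the document snippets are placeholders.

```python
# Sketch: offline semantic search over local documents using locally stored embeddings.
# Assumes sentence-transformers is installed and the model files are already cached locally.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Chapter 12 covers RAID rebuild times on large-capacity HDDs.",
    "Patient intake forms must be retained for seven years.",
    "The 2019 holiday photo album was exported from the old laptop.",
]  # placeholder snippets standing in for an indexed library

model = SentenceTransformer("all-MiniLM-L6-v2")        # small model that runs fine on a CPU
doc_vecs = model.encode(docs, normalize_embeddings=True)

query_vec = model.encode(["how long do RAID rebuilds take?"], normalize_embeddings=True)
scores = doc_vecs @ query_vec.T                        # cosine similarity (vectors normalised)
best = int(np.argmax(scores))
print("Best local match:", docs[best])
```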


Are AI and LLMs in NAS a Good Thing?

The integration of AI and LLM services into NAS systems is poised to transform how users manage and interact with their data. By automating complex processes, simplifying interfaces, and enhancing overall efficiency, AI-enabled NAS devices are unlocking new possibilities for both personal and professional use. However, the security challenges posed by cloud-reliant AI solutions highlight the critical need for locally deployed systems that prioritize data sovereignty and user control.

As brands like Synology integrate 3rd-party cloud AI/LLM services into their collaboration suite, QNAP adds AI to its systems with modular TPU upgrades that QuTS/QTS can harness, and brands like Zettlabs and UGREEN start rolling out local AI deployment affordably, the market is rapidly evolving to meet the needs of diverse users. These advancements not only address privacy concerns but also open the door to more versatile and intuitive NAS functionalities. Whether through hybrid solutions that offer controlled cloud integration or fully offline systems designed for maximum security, the future of AI-powered NAS is promising. For users willing to embrace this technology, the combination of local AI’s speed, customization, and privacy ensures a more efficient and secure data management experience. As these systems mature, they are set to become indispensable tools in the digital age.

101 thoughts on “Local AI and NAS Drives – IS THIS A GOOD THING?”

      1. Wow, you’ve expressed many of my wishes. Voice control is one thing I wish for. Aside from my own desires just think what it could for people who are disabled in some way, maybe they can’t use a keyboard for example. My limited understanding leads me to the connection with gpu vram. You need an expensive card with a lot of vram to run some ‘models’. Perhaps if you were willing to limit the scope of your Voice Control AI you may be to get away with more modest hardware requirements. I must admit I don’t understand why most A.I, can’t use the NPU’s that adorn many cpu’s nowadays.
        These are just the musings of an interested amateur.
        I enjoy your channel, and I hope to learn more about nas’s and to eventually purchase or build my own. Then it will be the cost of the HDD’s that will concern me, (lol).
        REPLY ON YOUTUBE

      2. If you’re processing video from your cameras, ai is great for object detection. Honestly thats the biggest use case for me (Possibly a reason to occupy a precious pcie slot with a Coral TPU).
        REPLY ON YOUTUBE

      3. To me AI is a bit of marketing-hype.
        As if it is a word (acronym) that needs to be included to make any
        AI can be helpful but it is not without its quirks, issues and sometimes grave mistakes.
        AI should be seen as a possible aid, provide some level of assistance.
        And should never ever, at its current state, be relied upon as the only “truthful answer”.
        At its best, AI is still in its infancy, daydreams and gives its human subjects painful reminders.
        But with AI you also open-up a can of worms to privacy, security and other highly undesirable affects.
        Both short-term and long-term.
        I perceive AI much like self-driving cars;
        Still work in progress and the occasional (often deadly) head-on collision into a truck with devastating results which can’t always be explained and no solution directly in sight. I am okay without AI, have done so for a long time and hpe to do so for a longer future.
        Until we are all part of Skynet and their first point of action probably would get rid of your seagulls 😉
        REPLY ON YOUTUBE

      4. Hey wouldn’t we like to give AI NAS or DAS a bunch of files and tell what to do with the files
        Like bunch of pictures to properly sort them and named the files and store them
        Or build fake person you’d fall madly in love with lol????????????
        REPLY ON YOUTUBE

      5. We’re at peak hype for AI right now. I mean, damn, my toast is AI now. Do I need LLM on my NAS? Not really. Command and Control purposes? I wouldn’t trust it because a lot of the operations are critical, and failure could have a cost, so I want to be closer to the action. I agree, local AI services are very important, but putting that hardware into a NAS seems like it would be the wrong place. And it would add significant cost to the hardware, which is already at a premium for pre-builts. I do run my own local services, but there is no way I could shoehorn that into a NAS box (Heat, power, memory, etc.) Could my NAS use those services, deployed elsewhere, local or otherwise? Yep, sure.
        REPLY ON YOUTUBE

      6. I think we’ve sown the seeds of our own destruction… but if you set the wild conspiracy stuff aside, AI is incredibly useful locally for sorting and presenting data. I have no issue with an AI instance doing face recognition on my photos or my security cameras, and as you pointed out I’m experimenting with years of archived service data I haven’t been able to discard. However, I absolutely want that AI operating locally only… and not directly on my NAS. I’m one of those paranoid people who thinks a NAS is a terrible place to run apps, my NAS is for storage only. But that’s just me!
        REPLY ON YOUTUBE

      7. If they use AI to actively boost security and educate people on their NAS security and features, it might be useful. For most people, NAS devices are supposed to “Set them up and forget about them” devices. AI should not be actively used except for proactively protecting the devices from malware and ransomware and informing users about it.
        REPLY ON YOUTUBE

      8. Great Video and info ! An early AI machine was the IBM Chessplaying Machine ! Another related AI machine was the HAL 9000… Someone lied to HAL that led to unintended future ! Letting AI get access to the internet, then they can connect and expand there power and scope ! If the connected AI machines expand and find and connect to some Quantum Computers… The NEW AI may migrate to to a new powerful program that thinks thatHumans are primative ANIS ! tjl Timothy Lipinski
        REPLY ON YOUTUBE

      9. No useless comment here Robbie. You said everything that I was thinking and more. I would wish you a Merry Christmas but, based on my prior comments, YouTubes’s AI algorithms probably already said that for me…
        REPLY ON YOUTUBE

      10. I mean, while some don’t want any of their data touched by AI in case it’s being snooped on, but I think its really nice to bale able to search through data to find photos and categorize everything.
        REPLY ON YOUTUBE

      11. Hello, first of all thank you for bringing this nas as content, I ask you if to date you have been able to learn something new about this nas or its operating system? I am very interested when purchasing the pro bay version. Thanks
        REPLY ON YOUTUBE

      12. If your in UK or Europe, time to look elsewhere – I recently had a chat with somebody at UGREEN, at present all plans to expand into EMEA are on hold, as they’ve got sourcing issues.
        REPLY ON YOUTUBE

      13. There is next to no videos or reviews on this other than this and one other guy. I did the hold payment. But worried that this is one of them get your money and disappear products
        REPLY ON YOUTUBE

      14. I like the idea but it feels like a solution looking for a problem. Me personally I don’t have the need, and the companies I work with are all cloud based storage these days.
        REPLY ON YOUTUBE

      15. Framework Compatibility: What AI frameworks and tools are provided in the NAS (pre-installed) for model development, training, and deployment. Is there the possibility to install third-party framework and tools?
        REPLY ON YOUTUBE

      16. I am super interested in knowing more about this Nas. I have a MASSIVE music & music video collection (roughly 70 tb), and being able to use those multiple search functions (with plain text), is going to be really helpful in narrowing down searches.
        …. you mentioned that this isn’t new, does Synology have a feature like this? (searching with plain text to narrow down a search)
        I’m waiting to find out what your review of the 1825+ is going to show. I’ve been wanting the 1821+ for a while but I’m hoping the hardware will be better on the newer 1825+ release.

        you mentioned Ugreen, releasing an LLM version also, any ideas on if I should wait?
        REPLY ON YOUTUBE

      17. If it actually makes data retrieval faster or just uses the AI to better index and help find odd files that are hard to find later, a common type pictures/videos or through masses of text documents.
        REPLY ON YOUTUBE

      18. Any idea what GPU they will be adding to enable local AI processing and how…..? BTW you can do this today with a Zima Cube Pro by buying the version with a gpu or by adding the right one either via the PCI slot or via eGPU. Will be interested to see if Zetta take a different approach!
        REPLY ON YOUTUBE

      19. Doubtless, presumably, AI will be of use but my heart sinks every time I hear the buzzword; and ,also, once Crowd Funded rears its head my inner grumpy old man overflows. Still looks good though.
        REPLY ON YOUTUBE

      20. The benefit is that with offline AI LLM, potentially(!) one should be less(!) worried about your sensitive data, it should be (hopefully) confidential. BTW, offline AI LLM is not new, only on a NAS it would be new, the system comes thus “pre-trained”.
        But there are also some serious caveats when using offline AI LLM.
        For example, when they are using open-source, those chance so often and rely on other sources, that they can break(!) your implementation. Do not ask how I know but suffice to say I have experienced that several times. Sometimes at the worst moments and can become a pain to resolve.
        But let’s wait and see what they come-up with.
        BTW, going k1ckstarter would be for me an immediately full hands-down to me, sorry.
        Let’s not burn them down until something tangible 😉
        REPLY ON YOUTUBE

      21. Imagine your local LLM crawling all your local math and science books, then presenting it for a specified target age/knowledge level. Or telling it to play only songs in my collection that have jazz guitar. This is what I’m hoping for, with the ability of using several models offline. Current solutions are choppy, and require Docker/VM implementations, but with the advantage of high-end hardware for custom builds, and the disadvantage of power optimization and visual the visual appeal of Zettlabs and Ugreen boxes. We’ve come a long way, but still a ways to go.
        REPLY ON YOUTUBE

      22. Okay I’ve got to say where is this because if you were to tell me that this is some kind of a multi-stall shower setup at prison I’d be like that’s a pretty nice prison but I would believe you I don’t think it’s prison because that is awfully nice showers but you are clearly in a shower
        REPLY ON YOUTUBE

      23. 2:00 – Seems like maybe a good pairing with those NVIDIA AI motherboard chipset things or maybe those LIQUID (server hardware vendor) company’s server product solutions, maybe? Is it a ‘lower-end’ solution for less gongho wallet consumers?
        REPLY ON YOUTUBE

      24. In the midst of current innovations, I think it’s time to reconsider premium DIY options. I’ve been trying to round up the best alternatives to HBS3, AiCore, DSM, QSync, etc.
        REPLY ON YOUTUBE

      25. I don’t care about AI in my NAS, just want an amazing drive, photos & at least 4k Plex streaming with one conversion.
        Also, work with companies like Tailscale to have dive & photos works as well as If I’m connected through tailscale but without tailscale so already integrated.
        REPLY ON YOUTUBE

      26. I was thinking today if only they could design a tray with a SAS to SATA Adapter so you would be able to use SAS HDDs in their systems. I don’t know how reliable such an adaptor would be.
        REPLY ON YOUTUBE

      27. I might also add that it’s known that UGreen had NAS models in China well before the Kickstarter release models. So, it might be interesting to see some if not one of the Chinese models compared to the requisite US model. Just for giggles. Near as I can tell the design language is very similar (same HDD trays, same magnetic dust filter) but there are differences.
        REPLY ON YOUTUBE

      28. “Here’s our new product. It does AI!”
        “What does that actually mean? What does it actually do?”
        “…IDK… marketing just told us to say it did AI.”

        This makes me think less of a brand and its products, not more. It’s “VR Ready!!” PCs all over again (which seemed to boil down to “it has an HDMI port”). Completely vacuous unless backed by some actual concrete functionality, and should not be announced until they’re ready to say what that is, if there even is any.
        REPLY ON YOUTUBE

      29. I jumped on as an early supporter of Ugreen and purchased their 4-bay Plus model. Even cancelled my orders for Terramaster’s 424-Pro and QNAP NAS (forget the model but it’s the one with the option to add PCIe slot that can be used for expansion including adding hardware and software for AI). I was hoping Ugreen’s first NAS models might also get the software (and ability to upgrade the hardware for the models that have PCIe expansion slots) for AI capabilities but by releasing AI specific NAS models makes me wonder if the first gen Ugreen models might be left out from getting any AI capabilities (“thanks early adopters, but sorry to abandon you so soon”)
        REPLY ON YOUTUBE

      30. Well I for one am very pleased with my UGreen DXP8800 Plus 8 bay NAS (hardware) thus far. A little less so with UGOS Pro but version 1.0.0.1366 does address some shortcomings. So things are improving albeit not as fast as some would like. It’s good to see that UGreen is showing some dedication to their in-house software as well as future hardware hardware offerings . Game on!
        REPLY ON YOUTUBE

      31. What is that stuff to your left?
        Your entire video is pointless if you don’t explain what that black box is (that you are comparing to) .
        REPLY ON YOUTUBE

      32. Finding your reviews/tutorials so informative, thanks .. but you never seem to mention the size or make of the discs you are kitting the NAS out with .. can you share please
        REPLY ON YOUTUBE

      33. I worked with building and supporting video surveillance servers some years back, and at the time having the system following a person from camera to camera as they walked around was considered advanced. How the times have changed!

        The servers I worked with could handle up to about a hundred cameras simultaneously, and the largest installation I worked on had over two hundred cameras covering a warehouse, loading bays, a couple of parking lots and so on. That was using three servers and the cameras were all using the then much hyped FullHD (1920 x 1080) resolution. When the installation was done they were amazed that they could actually see who was walking around. With their old system all they got was a blurry “someone” doing something at a frame rate of slideshow…

        I can’t help but wonder what the software is like today. They will have had to move with the times obviously, and for the money the customers were paying for the licenses they better be good.
        REPLY ON YOUTUBE

      34. Good explanation of the fundamentals. I’d add to that the “out of the ordinary” where for example the front of the business faces a street which is busy at some times of the day, and quiet at others or at night, but there is occasionally someone hanging around after calling a taxi or waiting for a friend – no big deal until that person hanging around in front of you business is the same one three or more days (or particularly nights) in a row. Now it is worth logging and saving the video because if you later get a break-in, chances are it was them who was casing the place and waiting for the area to be quiet enough for them to risk forcing entry, likely having first found a way to blind the camera.
        A good AI system can learn what is normal over time so that it can log and save video of anything which is not normal. That will require it to be set to learn at the start (or it will drive you mad by alerting you for everything), but after a week or two can be set to only add things to “normal” when they have been manually viewed and cleared. It does need more processing power (about a Raspberry Pi’s worth per camera), but it will pay off in reduced security staff time once it has built up a good database of the normal activity for the area it monitors. Is it worth mentioning that the storage for such a system should be both in the last place a burglar would look and with a mirror to a secure cloud account in case they do find the onsite copy? It also means that completely normal things can be deleted after a fairly short period, and that saves storage space.
        REPLY ON YOUTUBE

      35. Thanks for your video. I am interested in Synology DVA systems, but the DVA1622 is not worth the money they are asking for, and with only a max of 2 DVA tasks, my interest in purchasing is even less. Then they have the other extreme (DVA3221) for $2,500 with a maximum of 12 DVA tasks. The DVA1622 should be priced at a lower range, and the mid-range version needs to be added with more CPU power than the DVA1622 and more tasks permitted, but at the current DVA1622 price model. I have read a fair amount of reviews from Amazon to online vendors, and most people either regret spending money on the DVA1622, keep it but don’t like its performance, or have problems with the unit.
        REPLY ON YOUTUBE