Unwrap: The AI Revolution in Customer Intelligence
Thu, 30 Jan 2025

In the bustling world of data analytics, where every byte of information can be the key to unlocking the next big market insight, a new player has emerged with a promise to redefine how businesses understand their customers. Unwrap, an AI-powered analytics platform, has just secured a significant $12 million in Series A funding, signaling a robust vote of confidence from investors in its mission to transform customer intelligence.

A Fresh Infusion of Capital

This recent funding round was spearheaded by Scale Venture Partners, with notable participation from Atlassian Ventures, Cercano, ScOp VC, and the Allen Institute for Artificial Intelligence, among others. The involvement of figures like David Singleton, former CTO of Stripe, and Johnny Ho, co-founder of Perplexity, underscores the high expectations for Unwrap’s potential to disrupt the analytics sector.

The capital injection aims to bolster Unwrap’s sales and engineering teams, facilitating further product development and company expansion. Rory O’Driscoll, a partner at Scale Venture Partners, will join Unwrap’s board, bringing his experience from successful ventures like Box and DocuSign.

Understanding Unwrap’s Core Offering

At its heart, Unwrap is about making customer feedback actionable. Unlike many analytics tools that provide raw data or require extensive manual analysis, Unwrap employs AI to dive deep into unstructured customer feedback—be it from surveys, social media, or customer service interactions. Here’s how it stands out:

  • Automated Insight Generation: Unwrap automatically identifies themes, sentiments, and priorities from customer feedback. This automation reduces the manual labor typically associated with understanding customer data, allowing for quicker, more informed decision-making (a simplified sketch of this kind of aggregation follows this list).
  • Intuitive Data Visualization: For data analysts, one of the most compelling features is the platform’s ability to present complex data in easily digestible formats. Through its AI algorithms, Unwrap not only processes but also visualizes data in ways that highlight actionable insights, making it easier for teams to translate customer feedback into product enhancements or strategic pivots.
  • Real-Time Analysis: In today’s fast-paced market, the ability to react in real-time is invaluable. Unwrap offers near-instant analysis, enabling companies to be agile in responding to customer needs or market shifts.
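Unwrap’s models are proprietary, but the kind of theme-and-sentiment rollup described above can be sketched in a few lines of Python. Everything below is illustrative: the keyword lists and the sentiment lexicon are crude stand-ins for the learned classifiers a production system would use.

    import pandas as pd

    feedback = pd.DataFrame({
        "text": [
            "Checkout keeps crashing on mobile",
            "Love the new dashboard, very intuitive",
            "Mobile app crashes after the update",
            "Pricing feels too high for small teams",
        ],
    })

    THEMES = {  # crude stand-in for a learned topic model
        "stability": {"crash", "crashes", "crashing", "bug"},
        "usability": {"intuitive", "easy", "confusing"},
        "pricing": {"pricing", "price", "expensive", "high"},
    }
    NEGATIVE = {"crash", "crashes", "crashing", "bug", "confusing", "high"}

    def tag(text):
        words = set(text.lower().split())
        themes = [t for t, kws in THEMES.items() if words & kws]
        sentiment = "negative" if words & NEGATIVE else "positive"
        return pd.Series({"themes": themes or ["other"], "sentiment": sentiment})

    # One row per (feedback, theme) pair, then rank themes by mention volume
    tagged = feedback.join(feedback["text"].apply(tag))
    summary = (tagged.explode("themes")
                     .groupby(["themes", "sentiment"]).size()
                     .rename("mentions").reset_index()
                     .sort_values("mentions", ascending=False))
    print(summary)

Ranking theme/sentiment counts in this way is what turns a pile of free-text feedback into a prioritized list a product team can act on.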

Practical Applications for Data Analysts

For professionals leveraging data analytics tools daily, Unwrap offers several practical benefits:

  • Enhanced Product Development: By understanding customer pain points and desires more clearly, product teams can prioritize features or improvements that resonate most with their audience. This insights-driven approach can significantly reduce time-to-market for new features while ensuring they meet customer expectations.
  • Customer Experience Optimization: Customer service and experience teams can use Unwrap’s insights to tailor interactions, improve service protocols, or even predict customer satisfaction trends. This could lead to higher retention rates and more positive brand advocacy.
  • Market Strategy: For analysts tasked with market analysis, Unwrap’s ability to sift through vast amounts of feedback provides a granular view of market sentiments, helping to craft strategies that are more aligned with actual consumer behavior rather than assumptions.
  • Risk Management: By identifying patterns or sudden shifts in customer feedback, analysts can alert business leaders to potential risks or opportunities, fostering a proactive rather than reactive business culture.

The Future Outlook

With customer intelligence being described by investors as a “forever problem,” Unwrap is positioned not just as a tool for today but as a fundamental part of the analytics ecosystem for years to come. The company’s vision goes beyond merely processing data; it’s about instilling a deeper understanding of customers into every layer of business decision-making.

As Unwrap grows, its integration with other data systems and platforms will be crucial. The feedback from current users, echoed on platforms like X, highlights enthusiasm for its intuitive interface and the depth of insights provided, though some note the learning curve typical of adopting new AI technologies.

AI Central to Understanding Customers

Unwrap’s Series A funding is more than just a financial boost; it’s an endorsement of a vision where AI isn’t just an auxiliary tool but the central engine driving customer understanding. For data analysts, this means entering an era where their work could be less about sifting through data and more about interpreting and acting on insights that are already well-curated by AI. As Unwrap continues to evolve, it’s clear that the future of customer intelligence is not just about collecting feedback but transforming it into strategic action.

What is a Modern Data Warehouse?
Mon, 30 Dec 2024

Modern data warehouses are centralized repositories that allow businesses to store, integrate, and analyze data from multiple sources. Unlike traditional data warehouses, which often required complex ETL (Extract, Transform, Load) systems and rigid, on-site storage solutions, modern data warehouses are scalable, flexible, and cloud-based.


As businesses generate increasing volumes of structured, semi-structured, and unstructured data, traditional data warehouses have struggled to keep up. Modern data warehouses address this challenge by utilizing cloud technologies, big data architectures, and powerful analytics tools to efficiently store and process vast amounts of data. This enables businesses to eliminate data silos, increase data accessibility, and make faster, data-driven decisions.

Modern data warehouse: Everything you need to know


Organizations are increasingly relying on their data to make sound decisions in today’s fast-paced, data-driven environment. A Power BI consulting company can help businesses effectively integrate their data warehouses with advanced analytics tools like Power BI, enhancing data accessibility and visualization. The modern data warehouse has revolutionized how businesses manage, process, and analyze data. Modern data warehouses allow businesses to consolidate vast amounts of data from various sources into a central repository, making it easy to quickly access, evaluate, and derive insights. This article covers everything you need to know about modern data warehouses, including their components, benefits, and how they are transforming the way companies use data.

Key Components of a Contemporary Data Warehouse


A modern data warehouse is composed of several key components that work together to enable data collection, integration, and analysis:

  1. Data Sources
    Data sources form the foundation of any data warehouse. These sources may include internal systems such as operational databases, CRM (Customer Relationship Management) platforms, and ERP (Enterprise Resource Planning) software. Additionally, external sources such as IoT (Internet of Things) devices, social media platforms, and third-party data providers can also feed data into the warehouse.
    Modern data warehouses can handle not only structured data (like transactional data) but also semi-structured data (such as JSON or XML files) and unstructured data (like images or text from social media).
  2. ETL/ELT Data Integration Layer
    The data integration layer is responsible for gathering data from various sources and transforming it into a format suitable for loading into the warehouse. Typically, the integration process is split into two main approaches: ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform).
    ETL: In the traditional ETL process, data is extracted from the source, transformed into the required format, and then loaded into the data warehouse.
    ELT: In a modern data warehouse, ELT is more common. Here, raw data is loaded into the warehouse first, and the transformation process happens within the warehouse itself using the processing power of the cloud or big data engines (see the sketch after this list).
  3. Storage Layer
    The storage layer is where all of the data in the warehouse is housed. Modern data warehouses typically use cloud-based storage, allowing businesses to scale their storage needs up or down based on the volume of data. Cloud solutions like Microsoft Azure, Google Cloud Storage, or Amazon S3 are commonly used for this purpose.
    By enabling the storage of both historical and real-time data, cloud-based storage helps businesses analyze past trends and make more informed, current decisions.
  4. Data Modeling Layer
    Data modeling is crucial for organizing the data in a way that enables efficient querying and analysis. In this layer, data is structured into logical schemas, tables, and views. This is also where dimensional modeling techniques such as star or snowflake schemas are applied to ensure fast and effective reporting and analysis (the sketch following this list builds a minimal star schema).
    The data modeling layer guarantees that data is stored optimally, facilitating efficient queries and reducing the processing time required for complex analytics.
  5. Query and Analytics Layer
    Once the data is collected and modeled, the query and analytics layer provides the tools needed to access and analyze the data. This layer typically includes tools for data querying (such as SQL), data processing, and reporting. It is here that powerful analytics engines like Amazon Redshift, Google BigQuery, or Microsoft SQL Server come into play.
    In this layer, businesses often create interactive reports, visualizations, and dashboards using business intelligence (BI) tools like Power BI, Tableau, or Looker. Power BI, in particular, is known for its seamless integration with data warehouses, providing users with real-time insights and easy-to-understand visualizations.
  6. Data Governance and Security Layer
    Data governance is critical to ensure that data is accurate, consistent, and compliant with regulations. This layer manages metadata, enforces data ownership, tracks data lineage, and ensures data security. Modern data warehouses use automated governance tools and policies to monitor data quality and control access.
    Security features like encryption, user authentication, and role-based access control are essential for protecting sensitive corporate data.
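The ELT pattern from component 2 and the dimensional modeling from component 4 fit together naturally, as the sketch below shows. It uses Python’s built-in sqlite3 module as a stand-in for a cloud warehouse such as BigQuery, Redshift, or Snowflake: raw rows land in a staging table first, and the transformation into a small fact-and-dimension star schema happens inside the engine via SQL. The table and column names are invented for illustration.

    import sqlite3

    con = sqlite3.connect(":memory:")

    # Extract + Load: raw rows land untransformed in a staging table
    con.execute("CREATE TABLE raw_orders (order_id, customer, region, amount)")
    con.executemany(
        "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
        [(1, "Acme", "EU", 120.0), (2, "Acme", "EU", 80.0), (3, "Zen", "US", 200.0)],
    )

    # Transform (inside the warehouse): derive a dimension and a fact table
    con.executescript("""
        CREATE TABLE dim_customer AS
            SELECT DISTINCT customer AS customer_name, region FROM raw_orders;

        CREATE TABLE fact_orders AS
            SELECT r.order_id, d.rowid AS customer_key, r.amount
            FROM raw_orders r
            JOIN dim_customer d ON d.customer_name = r.customer;
    """)

    # Analysts then query the star schema, not the raw feed
    for row in con.execute("""
        SELECT d.customer_name, SUM(f.amount) AS total
        FROM fact_orders f JOIN dim_customer d ON d.rowid = f.customer_key
        GROUP BY d.customer_name
    """):
        print(row)  # ('Acme', 200.0), ('Zen', 200.0)

In a real warehouse the transform step would typically be managed by a dedicated tool and the surrogate keys generated explicitly, but the division of labor is the same.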

Benefits of a Modern Data Warehouse


Adopting a modern data warehouse offers numerous advantages that are transforming business operations and decision-making. Here are the key benefits:

  1. Scalability
    One of the primary benefits of modern data warehouses is scalability. Since most modern data warehouses are cloud-based, companies can scale their storage and processing capabilities up or down depending on their needs. The cloud infrastructure automatically adjusts to accommodate growing data volumes as businesses expand, eliminating the need for major hardware upgrades.
  2. Faster Decision-Making
    Modern data warehouses enable businesses to access and analyze data in real time or near real time. This faster access to insights allows decision-makers to respond quickly to changing market conditions, customer demands, and internal performance metrics. This agility helps companies stay ahead of competitors and make more informed, data-driven decisions.
  3. Data Integration
    One of the challenges businesses face is managing data silos—when data from different departments or systems is stored separately and not easily accessible across the organization. A modern data warehouse integrates data from various sources into a single, unified view of business operations. This promotes collaboration, enhances decision-making, and helps businesses make sense of all their data in one place.
  4. Advanced Analytics and Insights
    Modern data warehouses support advanced analytics techniques that allow businesses to uncover hidden insights in their data. With built-in AI, machine learning, and predictive analytics capabilities, businesses can identify trends, forecast outcomes, and gain insights that drive innovation and efficiency. For example, predictive analytics can be used to forecast customer behavior, optimize supply chains, or predict financial results.
  5. Cost Efficiency
    The cloud-based nature of modern data warehouses makes them cost-effective. Companies no longer need to invest in on-premise infrastructure, which requires significant upfront investment, ongoing maintenance, and IT staffing. Instead, cloud providers offer flexible pricing models based on consumption, so businesses only pay for what they use, making modern data warehouses an affordable solution for companies of all sizes.
  6. Improved Collaboration
    A modern data warehouse facilitates better collaboration between teams by offering a centralized platform where data can be shared and accessed by authorized users across the organization. Tools like Power BI enable employees to collaborate on real-time dashboards, share reports, and ensure that everyone is working with the same up-to-date information.

Challenges of Implementing a Modern Data Warehouse


Despite the clear benefits, there are several challenges associated with implementing and maintaining a modern data warehouse:

  1. Data Quality and Integration
    Ensuring that data is accurate, consistent, and up-to-date is a significant challenge for many businesses. Combining data from various sources and transforming it into a usable format can be time-consuming and complex. Organizations must implement robust data governance policies and use automation tools to ensure data quality (a minimal example of such automated checks follows this list).
  2. Complex Setup
    Setting up a modern data warehouse involves designing data models, selecting the appropriate cloud provider, integrating various data sources, and implementing security measures. This process can be complex and may require specialized knowledge of cloud infrastructure, data modeling, and business intelligence tools.
  3. Security Concerns
    With data stored in the cloud, security is a top priority. Organizations must ensure that their data warehouse is protected against unauthorized access and cyber threats. Proper encryption, access control, and regular security audits are necessary to safeguard sensitive business data.
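To make the data-quality challenge concrete, the sketch below shows the style of automated check a governance layer might run before promoting a raw feed into warehouse tables. The rules and column names are hypothetical; dedicated tools offer richer rule sets, but the underlying logic is this simple.

    import pandas as pd

    orders = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "amount":   [120.0, None, 80.0, -5.0],
        "country":  ["DE", "US", "US", "FR"],
    })

    checks = {
        "no duplicate keys":    orders["order_id"].is_unique,
        "no missing amounts":   orders["amount"].notna().all(),
        "amounts non-negative": (orders["amount"].dropna() >= 0).all(),
        "known country codes":  orders["country"].isin(["DE", "US", "FR", "GB"]).all(),
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        # Block the load rather than let bad rows reach downstream reports
        raise ValueError(f"Data-quality gate failed: {failed}")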

How Power BI Enhances a Modern Data Warehouse


Power BI is a powerful business intelligence tool that enhances the value of a modern data warehouse. By integrating Power BI with a modern data warehouse, businesses can create interactive dashboards, generate real-time reports, and perform advanced analytics on their data.


How Power BI Enhances the Data Warehouse Experience:

  • Real-Time Data Access: Power BI connects directly to a modern data warehouse, providing businesses with real-time data analysis and visualizations.
  • Advanced Analytics: Power BI’s integration with AI and machine learning models allows companies to perform advanced analytics and generate predictive insights.
  • Ease of Use: Power BI’s user-friendly interface allows employees to create reports, explore data, and share insights across the organization.

Conclusion


The modern data warehouse plays a crucial role in today’s data-driven business environment. By consolidating data from multiple sources and providing real-time access to insights, modern data warehouses enable organizations to make data-driven decisions quickly and effectively. Through solutions like Power BI consulting services, businesses can fully leverage their data warehouses, empowering teams with advanced analytics and visualization tools. While implementing and managing a modern data warehouse can present challenges, the benefits far outweigh the obstacles, making it a valuable investment for any organization looking to stay competitive in today’s fast-paced market.

Your 8-Month Fast-Track to a High-Paying Data Analyst Career Revealed
Mon, 04 Nov 2024

In the rapidly evolving world of technology, the role of a data analyst has become increasingly vital across industries. For those aspiring to enter this dynamic field, understanding the essential skills and the pathway to mastery is crucial. The Programming with Mosh YouTube channel recently outlined a comprehensive roadmap for becoming a data analyst in 2024, offering a step-by-step guide that anyone can follow to break into the field within 8 to 16 months.

The Foundation: Mathematics and Statistics

A strong foundation in mathematics and statistics is the cornerstone of data analysis. This area is critical because it provides the analytical tools necessary to interpret data effectively. Aspiring data analysts should focus on mastering concepts such as mean, median, standard deviation, probability, and hypothesis testing. Spending one to two months on these topics will help build the essential analytical mindset needed for data-driven decision-making.
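As a small worked example of those concepts, the snippet below computes the descriptive measures named above and runs a two-sample hypothesis test with NumPy and SciPy. The numbers are invented for illustration.

    import numpy as np
    from scipy import stats

    # e.g. page load times (seconds) before and after a site change
    control = np.array([4.1, 3.8, 4.4, 4.0, 3.9, 4.2])
    variant = np.array([3.6, 3.7, 3.5, 3.9, 3.4, 3.8])

    print("mean:", variant.mean(),
          "median:", np.median(variant),
          "std:", variant.std(ddof=1))  # sample standard deviation

    # H0: both groups share the same mean; reject when p < 0.05
    t_stat, p_value = stats.ttest_ind(control, variant)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}",
          "-> significant" if p_value < 0.05 else "-> not significant")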

Mastering Excel: The Industry Staple

Excel remains a fundamental tool in the data analyst’s toolkit. Despite the rise of more advanced software, many companies continue to rely on Excel for data manipulation and analysis. Analysts are expected to be proficient in functions, pivot tables, and charts, which are indispensable for organizing and interpreting data. A solid grasp of Excel can be achieved with two to three weeks of dedicated practice, making it a critical step early in the learning journey.

SQL and Python: The Languages of Data

SQL (Structured Query Language) is the backbone of database management, allowing analysts to query and manage large datasets efficiently. Learning SQL is relatively straightforward, and within one to two months, most individuals can acquire the skills needed to manipulate databases effectively.

Python, on the other hand, is a versatile programming language widely used in data analysis. It is particularly powerful when paired with libraries like Pandas and NumPy, which simplify data manipulation and analysis. Python also serves as a gateway to more advanced topics, such as machine learning. Beginners are encouraged to spend one to two months learning Python before potentially branching out into R, another language popular among data analysts.
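To see how the two languages fit together in practice, here is a minimal sketch using the sqlite3 module that ships with Python, so nothing needs installing. The same aggregation is expressed once in SQL and once in pandas; the table and values are invented for illustration.

    import sqlite3
    import pandas as pd

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)",
                    [("EU", 120.0), ("EU", 80.0), ("US", 200.0)])

    # The SQL way: aggregate inside the database
    print(pd.read_sql_query(
        "SELECT region, SUM(amount) AS total FROM sales GROUP BY region", con))

    # The pandas way: pull the rows out, then aggregate in Python
    df = pd.read_sql_query("SELECT * FROM sales", con)
    print(df.groupby("region")["amount"].sum())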

Data Collection, Preprocessing, and Visualization

Collecting and cleaning data is a crucial step in any analysis. Data often comes from various sources and requires preprocessing to be useful. Tools like Python’s Pandas library are essential for this phase, which typically takes one to two months to master.

Once the data is prepped, visualization becomes key. Effective data visualization helps uncover patterns and communicate insights clearly. Python libraries such as Matplotlib and Seaborn, along with business intelligence tools like Tableau and PowerBI, are indispensable for this purpose. One to two months of practice in data visualization tools will allow analysts to create compelling and informative visuals that support business decisions.
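A minimal cleaning-then-plotting pass might look like the sketch below: pandas drops incomplete rows, and Seaborn with Matplotlib renders a grouped bar chart. The dataset is invented for illustration.

    import pandas as pd
    import matplotlib.pyplot as plt
    import seaborn as sns

    raw = pd.DataFrame({
        "month":   ["Jan", "Jan", "Feb", "Feb", "Mar", "Mar"],
        "channel": ["web", "store", "web", "store", "web", None],
        "revenue": [10.0, 8.0, 12.0, None, 15.0, 9.0],
    })

    clean = raw.dropna()  # preprocessing: discard rows with missing fields
    sns.barplot(data=clean, x="month", y="revenue", hue="channel")
    plt.title("Revenue by month and channel")
    plt.tight_layout()
    plt.show()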

Advanced Topics: Machine Learning and Big Data

While not mandatory for every data analyst role, understanding the basics of machine learning can provide a competitive edge. Machine learning enables computers to make predictions based on data, and knowledge of this area is increasingly sought after. Spending a month or two learning machine learning fundamentals, including tools like TensorFlow and Scikit-learn, can be beneficial.
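As a first taste of those fundamentals, the sketch below trains a scikit-learn classifier on the library’s built-in Iris dataset and scores it on held-out rows; the train/test split and accuracy check are core habits of any introductory course.

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42)

    # Fit on the training rows, then measure accuracy on unseen rows
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")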

In addition, as datasets grow in size, the ability to handle big data becomes important. Tools like Hadoop and Spark are designed to process massive amounts of data efficiently. Familiarity with these tools, which can be acquired in one to two months, is increasingly valuable in a world where big data plays a central role in business operations.

A Path to Success in Just 8 to 16 Months

The roadmap provided by Programming with Mosh suggests that with a commitment of three to five hours per day, an aspiring data analyst can acquire all the necessary skills within 8 to 16 months. By following this structured approach, individuals can position themselves for success in one of the most in-demand fields today. Whether starting from scratch or upskilling, this guide offers a clear and practical pathway to a rewarding career in data analysis.

How to Become a Data Analyst in 2024: Embracing AI and Core Skills
Sat, 02 Nov 2024

In an era marked by rapid technological advancements, the role of a data analyst is evolving at an unprecedented pace. Luke Barousse, a seasoned data analyst and YouTube content creator, offers a comprehensive guide on how to become a data analyst in 2024. Drawing from his diverse experiences in corporate America and working for top-tier influencers like Mr. Beast, Barousse shares invaluable insights into the tools and skills necessary for this dynamic field.

Core Skills for Aspiring Data Analysts

Before diving into the latest AI tools, Barousse emphasizes the importance of mastering core skills that remain essential in the data analytics landscape. “SQL, or ‘sequel’ as many call it, tops the list,” he notes. This query language is crucial for communicating with databases, a fundamental aspect of data analysis. According to Barousse, SQL is mentioned in almost half of all job postings for data analysts, underscoring its significance.

Excel, the ubiquitous spreadsheet software, follows closely. Despite its intended use for ad-hoc analysis, many companies rely heavily on Excel for complex data tasks. “Excel is in about a third of all job postings, which speaks to its continued relevance,” Barousse adds.

When it comes to programming languages, Python and R are prominent. Barousse highlights Python’s versatility, making it suitable for tasks ranging from advanced analytics to machine learning. “Python is nearly as popular as Excel, appearing in almost a third of job postings,” he points out. R, while more specialized, remains a valuable tool for statistical analysis, though it’s less commonly required than Python.

Visualization tools such as Tableau and Power BI are also critical. These tools enable data analysts to create interactive dashboards and visualizations, aiding non-technical stakeholders in understanding complex data insights. “I’ve spent weeks building dashboards that help my colleagues make data-driven decisions,” Barousse shares.

AI Revolution: Transforming Data Analysis

The landscape of data analysis is being reshaped by AI, lowering the barrier to entry and enhancing efficiency. Barousse reflects on his experience building a data analyst portfolio without writing a single line of code, thanks to advancements in AI tools. “The barrier to entry to become a data analyst and actually analyze data is getting lower and lower,” he asserts.

One significant development is the integration of AI into SQL workflows. Barousse uses GitHub Copilot, an AI coding assistant, to speed up query writing and improve efficiency. “Copilot can autocomplete queries and answer questions about SQL syntax, but I’m exploring other tools that might offer even more capabilities,” he says.

Microsoft Excel has also seen transformative updates. The introduction of Microsoft 365 Copilot, which leverages OpenAI’s technology, allows users to ask questions about their data and receive insights directly within Excel. Another major feature is the integration of Python, enabling advanced calculations and analysis within the familiar Excel environment. “These updates make Excel more powerful than ever, bridging the gap between traditional spreadsheets and modern data analysis tools,” Barousse explains.

The Importance of Learning Python

For those starting their journey as data analysts, Barousse recommends Python as the go-to programming language. “Python is a multipurpose language that can handle a wide range of tasks, from data scraping to building web applications,” he says. He also notes that AI coding assistants like GitHub Copilot and Google’s Duet AI can help learners quickly grasp Python by providing real-time feedback and code suggestions.

Visualization Tools: Power BI vs. Tableau

When it comes to visualization tools, Barousse has a preference for Power BI due to its integration with Power Query and DAX functionality. “Power BI makes it easier to clean and analyze data, though Tableau excels in community support and sharing capabilities,” he explains. Both tools have received AI enhancements, with Power BI incorporating a basic version of Copilot and Tableau developing its own AI features under Salesforce’s Einstein Analytics.

AI Assistants and Job Security

A common concern among data analysts is whether AI will replace their jobs. Barousse addresses this by citing a KPMG survey, which found that over half of business leaders expect AI to expand their workforce rather than shrink it. “AI is designed to assist, not replace, data analysts. It enhances productivity and allows us to focus on more complex, value-added tasks,” he emphasizes.

Supporting this view, a Harvard study revealed that consultants using AI were significantly more productive and produced higher quality results compared to those who didn’t use AI. “The data is clear: AI is here to improve our jobs, not take them away,” Barousse concludes.

As Barousse navigates the transformative landscape of data analysis, he remains optimistic about the future. With AI tools streamlining workflows and enhancing capabilities, the role of a data analyst is more dynamic and exciting than ever. For those entering the field, embracing these advancements while mastering core skills is key to thriving in this evolving profession.

A Day in the Life of a Data Analyst at AWS
Fri, 01 Nov 2024

As the clock strikes 9:45 AM, Agatha Kang, a Business Intelligence Engineer at Amazon Web Services (AWS), begins her day. The office is still quiet, a perfect setting for what promises to be a busy day filled with operational meetings and data projects. Kang’s role at AWS is multifaceted, encompassing responsibilities of a business analyst, data engineer, and data analyst. “At AWS, our goal is to track metrics and see how the business is performing, and all of this is done through data,” she explains.

Kang’s journey into the tech world is as intriguing as her current role. She spent six years as a data analyst in healthcare before making a significant career pivot. “I loved mentoring people when I got promoted to manager of data analytics in healthcare, but I missed being hands-on with data,” Kang shares. “Healthcare operates at a slower pace, and I wanted more of a challenge, which I found at Amazon. The pace here is a lot faster, and the work is very challenging, but that’s what I enjoy.”

The Workday Begins

Kang’s typical day involves a heavy use of SQL along with tools like Excel, Python, and BI visualization tools. Her primary collaborators are the planning teams, who rely on her to build data solutions that automate their manual work and enable data-driven decisions. “Most of my day is spent using SQL, but I also use other tools like QuickSight and cloud technologies. My work helps the planning teams make better decisions based on data,” Kang explains.

At 10:00 AM, Kang joins an operational meeting where her team discusses on-call tickets and any ad hoc work that needs to be addressed. These meetings are vital for brainstorming and sharing ideas on how to improve their data processes. “Working with smart people who have more data experience than I did initially was challenging, but I’ve learned so much from my team,” Kang says. “These operational meetings are super helpful for us to collaborate and find better ways to build our solutions.”

Broadening Horizons: Sectors Embracing AI

Kang’s work spans various sectors, reflecting the broad applicability of AI and data analytics. “In my role, I’ve worked on projects for different industries including healthcare, manufacturing, aerospace, telecommunications, and consumer packaged goods,” Kang notes. This diversity keeps her job interesting and continuously pushes her to learn new technologies. “Being in tech means dealing with complex problems, and that’s what makes it so rewarding. Each sector has its unique challenges, but the principles of data analysis remain the same.”

Federal Government: A Major Growth Driver

Public sector projects have become a significant part of Kang’s portfolio, especially with the federal government’s increasing reliance on data analytics for efficiency and optimization. “The federal government has doubled its investment in AI and data analytics, particularly in defense and intelligence,” Kang explains. “We work on projects that optimize logistics, supply chains, and even predictive maintenance for aircraft. The goal is to leverage data to enhance operational efficiency and decision-making capabilities.”

Enterprise Software and AI: A Synergistic Relationship

The integration of AI with enterprise software has transformed how businesses operate, and Kang’s role at AWS places her at the forefront of this evolution. “Enterprise software is a massive industry, and traditional technology stacks are being revolutionized by AI,” Kang states. “These AI-driven applications allow companies to predict customer behavior, prevent supply chain disruptions, and maintain complex infrastructures like the power grid or oil and gas networks.”

Adapting to New Pricing Models

Kang’s team has also adapted to new business models to meet customer needs better. “We transitioned from a subscription model to a pay-as-you-go model a couple of years ago,” she explains. “This change has increased our revenue growth because it allows clients to scale their usage based on their needs. It’s more flexible and aligns better with how businesses want to manage their costs.”

Navigating a Transformative Landscape

As the day winds down, Kang reflects on her decision to switch from healthcare to tech and the journey so far. “I’ve never looked back since making the move. The fast-paced environment and the opportunity to solve complex problems with smart people make every day exciting,” she says. For those considering a career in tech, Kang offers this advice: “Embrace the challenges and keep learning. The tech industry is constantly evolving, and staying curious is key to success.”

Kang’s day may end at the office, but her passion for data and technology continues to drive her forward. “The work we do at AWS not only impacts the company but also shapes the future of data analytics and AI. It’s a privilege to be part of such a transformative journey.”

Unlocking the Power of AI Tools in Data Analysis
Wed, 30 Oct 2024

Artificial intelligence (AI) has become indispensable for extracting meaningful insights from complex datasets in the ever-evolving realm of data science. YouTube content creator and AI enthusiast Andy Stapleton recently delved into the capabilities of three prominent AI tools—Julius AI, Vizly, and the latest version of ChatGPT. His goal was to determine which tool excels in data analysis and to understand their limitations. Here’s what he discovered.

The Experiment Begins

Stapleton began his experiment with a straightforward dataset: public healthcare data. He input the same data into each AI tool and issued the same prompt: “This is public healthcare data. Provide some insights into what this data shows, including graphs or other visualizations that you think will help.”

Julius AI was the first to be tested. It quickly generated Python code to analyze the dataset, producing visualizations that included the distribution of hospital codes, admission types, severity of illness, and lengths of stay. “Julius AI provided a comprehensive initial analysis,” Stapleton noted. “It’s clear that it’s capable of handling large datasets and generating useful insights efficiently.”
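For readers curious what that kind of generated analysis looks like, the hand-written sketch below produces the same style of output: distribution plots over admissions columns. The column names and values are hypothetical stand-ins; neither Stapleton’s dataset nor the tools’ actual generated code is reproduced here.

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.DataFrame({
        "admission_type": ["Emergency", "Urgent", "Trauma", "Emergency", "Urgent"],
        "stay_days": [4, 12, 25, 7, 9],
    })

    fig, axes = plt.subplots(1, 2, figsize=(9, 3))
    df["admission_type"].value_counts().plot.bar(ax=axes[0], title="Admission types")
    df["stay_days"].plot.hist(bins=5, ax=axes[1], title="Length of stay (days)")
    plt.tight_layout()
    plt.show()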

Comparing the Tools

Next, Stapleton tested Vizly with the same dataset and prompt. Vizly produced similar visualizations but offered a unique summary of the public healthcare data analysis. “Vizly chose slightly different parameters for its analysis,” Stapleton observed. “One notable feature was its interactive graphs, which allow users to hover over data points for additional information.”

Finally, Stapleton turned to ChatGPT’s latest version. ChatGPT not only generated visualizations but also provided an analysis plan and interactive graphs. “The interactivity of ChatGPT’s visualizations sets it apart,” Stapleton said. “You can explore the data in a more dynamic way, which is incredibly valuable for deeper analysis.”

Diving Deeper

Stapleton then asked each tool to provide a breakdown of the distribution of hospital stays by duration. All three tools performed admirably, but Vizly’s interactive capabilities again stood out. “Vizly’s graph was the most user-friendly,” Stapleton remarked. “It allowed for zooming and detailed exploration of the data.”

For his next test, Stapleton introduced a more challenging dataset from his PhD research on organic photovoltaic (OPV) devices. This dataset was unstructured, containing metadata and raw data. Julius AI impressed by correctly identifying and plotting the IV curve of the OPV device, despite the complexity of the data. “Julius AI’s ability to self-correct and find the necessary data was impressive,” Stapleton said.

Vizly struggled initially but eventually managed to identify the IV curve data after several attempts. ChatGPT, however, quickly processed the unstructured data and accurately plotted the IV curve, even calculating the efficiency of the OPV device. “ChatGPT’s reasoning capabilities are superior,” Stapleton concluded. “It can handle complex datasets with ease.”

Testing the Limits

To push the boundaries further, Stapleton tested the AI tools with an image of silver nanowires and single-walled carbon nanotubes. Julius AI and Vizly both attempted edge detection but provided varying results. ChatGPT, while unable to directly measure the nanowires’ diameter, offered valuable guidance on using other tools like ImageJ for precise measurement. “ChatGPT’s ability to provide actionable advice is a significant advantage,” Stapleton noted.

Final Thoughts

After extensive testing, Stapleton found that both Julius AI and ChatGPT stood out as the most effective tools for data analysis. “For anyone working with large and complex datasets, Julius AI and ChatGPT are invaluable,” he said. “They complement each other perfectly, making data analysis more accessible and efficient than ever before.”

Stapleton’s deep dive into AI tools for data analysis highlights the transformative potential of these technologies. As AI continues to advance, tools like Julius AI, Vizly, and ChatGPT will play a crucial role in helping researchers, analysts, and businesses unlock new insights from their data.

Rifi’s Groundbreaking Approach to Data Observability and Rapid Issue Detection
Sat, 06 Jul 2024

In today’s fast-paced digital landscape, harnessing and understanding data is paramount for success. Enter Rifi, a company at the forefront of data observability and rapid issue detection. In a recent episode of “Taking Stock,” a reporter sat down with Rifi’s CEO, Sanjay Agrawal, to delve into the innovative features and user experiences that set Rifi apart in the world of data operations.

Rifi’s platform, housed in the cloud, empowers data teams to stay ahead of the curve by offering unparalleled visibility into their operations. Sanjay highlights two key features that define Rifi’s offering. Firstly, the platform helps teams manage their budgets effectively, ensuring they don’t exceed allocations for cloud services like Snowflake or BigQuery. This proactive approach to cost management saves money and fosters trust within organizations, as data flows smoothly and reliably.

Secondly, Rifi prioritizes the time of data teams, recognizing that efficiency is crucial for building trust and making informed decisions. Sanjay notes that Rifi has enabled some customers to drastically reduce escalations from data teams to their businesses, a testament to the platform’s ability to streamline operations and increase productivity.

One of Rifi’s standout success stories involves a public company with a $10 billion market cap. Within just three weeks of implementing Rifi’s solution on BigQuery, the company noticed a significant increase in failed jobs, indicating issues with data accessibility. Instead of resorting to the traditional approach of requesting more resources, Rifi’s platform enabled the company to identify the root cause of the problem quickly: certain user and query patterns consuming excessive capacity. By addressing these issues promptly, the company was able to free up nearly a quarter-million dollars worth of capacity, demonstrating the tangible impact of Rifi’s technology on the bottom line.

When asked about Rifi’s approach to innovation, Sanjay emphasizes the company’s commitment to listening to its customers. With clients spanning various industries, including public, healthcare, finance, and startups, Rifi understands the diverse needs and challenges facing data teams today. By staying attuned to customer feedback and continuously iterating on its platform, Rifi ensures that it remains at the forefront of innovation, delivering solutions that meet the dynamic demands of the modern tech landscape.

In conclusion, Rifi’s groundbreaking approach to data observability and rapid issue detection is revolutionizing the way organizations harness and leverage their data. By combining cutting-edge technology with a customer-centric approach, Rifi empowers data teams to navigate the complexities of today’s digital world with confidence and agility.

US Agencies Request the Most User Data From Big Tech, Apple Complies the Most
Thu, 04 Jul 2024

Americans concerned about their user data falling into the hands of foreign governments may want to look closer to home.

According to new research by VPN provider Surfshark, the US government makes more requests for user data from Big Tech companies than any other jurisdiction in the world. The company analyzed data requests to Apple, Google, Meta, and Microsoft by “government agencies of 177 countries between 2013 and 2021.”

The US came in first with 2,451,077 account requests, more than four times the number of Germany, the number two country on the list. In fact, the US made more requests than all of Europe, including the UK, which collectively came in under 2 million.

While the US and EU were responsible for a combined total of 60% of all data requests, the US “made 8 times more requests than the global average (87.9/100k).”

The number of accounts being accessed is also growing, with a five-times increase in requests from 2013 to 2021. The US alone saw a 348% increase during the time frame, and the scope and purpose of the requests are expanding.

“Besides requesting data from technology companies, authorities are now exploring more ways to monitor and tackle crime through online services. For instance, the EU is considering a regulation that would require internet service providers to detect, report, and remove abuse-related content,” says Gabriele Kaveckyte, Privacy Counsel at Surfshark. “On one hand, introducing such new measures could help solve serious criminal cases, but civil society organizations expressed their concerns of encouraging surveillance techniques which may later be used, for example, to track down political rivals.”

The report also sheds light on which companies comply the most versus which ones push back against requests. For all of its privacy-oriented marketing — “what happens on your iPhone stays on your iPhone” — Apple complies with data requests more than any other company, handing over data 82% of the time.

In contrast, Meta complies 72% of the time, and Google does 71% of the time. Microsoft, on the other hand, pushes back the most among Big Tech companies, only handing data over 68% of the time.

The findings may also put a dent in US efforts to ban TikTok and other foreign apps under the guise of protecting user privacy and data.

One-Third of Organizations Struggle With Data Loss Prevention Systems
Tue, 02 Jul 2024

The Cloud Security Alliance (CSA) has bad news for the industry, saying that nearly one-third of organizations struggle with data loss prevention (DLP) systems.

The CSA is an organization dedicated to helping secure cloud computing. A survey the organization conducted with Netskope found that DLP solutions are a critical component of cloud security.

Unfortunately, that’s where the good news ends. While companies are relying on DLP systems, nearly a third struggle to use them effectively.

Among the top challenges cited by organizations are management difficulties (29%), too many false positives (19%), the need for manual version upgrades (18%), and deployment complexity (15%).

“DLP solutions are an integral part of organizations’ data security strategy, but leaders are still struggling with this strategy and the implementation of solutions, especially for how complicated legacy and on-prem based solutions are to manage and maintain,” said Naveen Palavalli, Vice President of Products, Netskope. “These findings highlight the need for a comprehensive and easy-to-use cloud delivered data protection solution that integrates into their existing security controls and is a key tenant of their Zero Trust security strategy.”

Cloud security is increasingly in the spotlight as more and more organizations experience data breaches at a time when the cloud is becoming integral to more companies and industries.

The Biden administration has signaled it is preparing to regulate cloud security in an effort to better protect organizations. If the CSA’s findings are any indication, it looks like the industry could use the help.

Microsoft Combines the Power of Python and Excel
Tue, 25 Jun 2024

Microsoft has given Excel a major upgrade, unveiling Python in Excel to give users access to the power of Python’s data tools.

Python is one of the most popular languages for data processing and analytics, thanks to its ease of use, versatility, and powerful features. Microsoft is now giving users the ability to leverage that power with a Public Preview of Python in Excel, according to a company blog post:

Now you can do advanced data analysis in the familiar Excel environment by accessing Python directly from the Excel ribbon. No set up or installation is required. Using Excel’s built-in connectors and Power Query, you can easily bring external data into Python in Excel workflows.

We’re partnering with Anaconda, a leading enterprise grade Python repository used by tens of millions of data practitioners worldwide. Python in Excel leverages Anaconda Distribution for Python running in Azure, which includes the most popular Python libraries such as pandas for data manipulation, statsmodels for advanced statistical modeling, and Matplotlib and seaborn for data visualization.
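The announcement’s code samples are not reproduced above, but as a rough sketch based on Microsoft’s public documentation for the preview: Python is entered in a cell through the =PY formula, and the xl() accessor pulls worksheet ranges into Python objects. The snippet below only runs inside Excel’s hosted Python environment, and exact behavior may differ across builds.

    # Entered in a worksheet cell via =PY( ); executes in Excel's hosted Python.
    # xl() is the range accessor Excel exposes to that environment.
    df = xl("A1:C100", headers=True)  # read a range as a pandas DataFrame
    df.describe()                     # summary statistics spill back into the grid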

The feature is already gaining fans among customers.

“The ability to run Python in Excel simplifies McKinney’s reporting workflows. We used to manipulate data structures, filter, and aggregate data in a Jupyter Notebook, and build visuals in Excel,” said Greg Barnes, McKinney Executive Director of Data and Analytics.  “Now we can manage the entire workflow in Excel. This is going to make Excel that much more powerful and make Python more accessible across the organization. Python support is the most exciting update for Excel in my career!”

Why is Data Analytics in Healthcare so Important?
Mon, 06 May 2024

The healthcare system constantly faces the challenge of using large amounts of data effectively. Medical companies face security issues and the risk of data breaches. One way forward is to deploy medical data analysis software.

Medical data analysis — what is it?

Information about individual patients and about the population as a whole helps not only to extend patients’ lives and improve their quality, but also to improve treatment outcomes through better procedures and reduced medical waste.

Medical analytics can reduce the cost of treatment, predict outbreaks of epidemics, enable early screening for certain diseases, improve quality of life in general, and bring modern treatment methods into everyday practice. Medical staff collect huge amounts of data today, and they need the tools to put those numbers to use.

How important is the analysis of medical data?

Machine learning methods make it possible to analyze huge amounts of information, such as data on a particular person’s immune status.

Using this data, you can:

  • plan medical care for people and predict the course of diseases;
  • identify and implement the most effective measures for decreasing the number of hospital readmissions;
  • reduce the risk of blood poisoning and kidney failure, intervene at an early stage avoiding negative consequences;
  • optimize outcome management and costs of medicines;
  • develop tools to improve the quality of patient care.

Personalized medicine is focused on treatment decisions based on all information about the patient. To do this, more and more data will need to be processed in the future. For example, each person’s “genetic blueprint”, DNA, will need to be checked for genetic changes.

Benefits of data analytics in medicine

Technological development makes it possible to process both small and large amounts of data and to study even rare diseases. This versatility is what makes data analysis so valuable.

These technologies help bring analysis results into the daily work of individual doctors and patients. The doctor receives a computer program that collects his patients’ data and shows, on screen, the medical indicator values from past cases in his practice that are closest to those of the new patient currently under review. This makes it possible to identify similar cases and optimize the treatment regimen.
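That closest-past-patients lookup is, at heart, a nearest-neighbour query. Below is a minimal sketch with scikit-learn over invented vital-sign vectors; the indicators, values, and scaling choice are purely illustrative.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors
    from sklearn.preprocessing import StandardScaler

    # columns: age, systolic blood pressure, fasting glucose (illustrative)
    past_patients = np.array([
        [54, 130, 5.4],
        [61, 145, 7.8],
        [47, 120, 5.1],
        [66, 150, 8.2],
    ])
    new_patient = np.array([[58, 142, 7.5]])

    scaler = StandardScaler().fit(past_patients)  # put indicators on one scale
    nn = NearestNeighbors(n_neighbors=2).fit(scaler.transform(past_patients))
    dist, idx = nn.kneighbors(scaler.transform(new_patient))
    print("most similar past cases:", idx[0], "distances:", dist[0])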

Analyzing how well a patient follows the doctor’s instructions after discharge from the hospital helps a medical institution predict hospital readmission within the following months and take appropriate measures.

Studying patient condition data can improve treatment

Data science plays a key role in monitoring patient health and informing physicians of options to prevent potential problems. Specialists use powerful predictive tools for early detection of chronic and systemic diseases.

Data processing algorithms also help model exactly how medicines will act on the human body. This allows companies to reduce laboratory experiments and costs, and to develop innovative medicines for treating serious diseases.

It is important to take into account the healthcare system’s specific challenges, since it involves the collection and analysis of sensitive patient data. It is also very important to understand that the value of digital infrastructure lies in the intelligent, controlled use of data for the benefit of the individual and society in general.

Google Cloud Unveils New Tools to Unify Data
Thu, 25 Apr 2024

Google Cloud has unveiled its latest innovations, aimed at helping companies unify database, analytics and AI.

Google Cloud is the third-leading cloud provider, behind AWS and Microsoft Azure. The company is viewed as a particularly good option for machine learning development and has strong support for open source software.

The company’s latest tools, Dataplex, Datastream, and Analytics Hub, will go a long way toward improving its standing even further.

Dataplex is designed to “centrally manage, monitor and govern your data across data lakes, data warehouses and data marts, and make this data securely accessible to a variety of analytics and data science tools.”

Datastream, currently available in preview, helps “move and synchronize data between heterogeneous databases, storage and applications reliably to support real-time analytics, database replication and event-driven architectures with Datastream, our serverless change data capture (CDC) and replication service.”

Analytics Hub is designed to make it easy to “access and share valuable datasets and analytics assets (think BigQuery ML models, Looker Blocks, data quality recipes, etc.) across any organizational boundary.” Those interested will need to sign up for preview access.

The company’s latest tools should go a long way toward helping its customers make the most of their data, as well as AI applications.

Former VP Says Salesforce Is Lying About Salesforce Genie Capabilities
Mon, 14 Aug 2023

Karl Wirth, a former Salesforce Senior VP, has sued the company, claiming it is lying about its Salesforce Genie capabilities.

When Salesforce introduced Genie, it touted the platform’s ability to process customer data and provide insights in real-time:

“When milliseconds matter most, your healthcare provider can deliver proactive guidance and care recommendations with access to your real-time patient data,” the company wrote shortly after its release.

Unfortunately, according to Wirth’s lawsuit, “it was all a lie.” The lawsuit alleges that much of Genie’s supposedly “real-time” functionality didn’t live up to the hype and “in fact many of its processes took several hours.”

According to Business Insider, Wirth raised concerns about the platform’s performance, which he says were ignored.

“Plaintiff reasonably believed that publicly claiming the CDP operated in ‘real-time’ without actually having (or even intending to have in the near future) such a capability would be fraudulent, and likely violate numerous provisions of Federal law relating to fraud against shareholders, as well as rules and regulations of the Securities and Exchange Commission,” the lawsuit stated.

Wirth’s lawsuit claims that Lidiane Jones, now Slack CEO, engaged in a “deceitful campaign to diminish” Wirth’s reputation within the company, in response to his concerns. When he took his concerns to CTO Parker Harris, Wirth was fired within hours.

Wirth’s lawsuit alleges “whistleblower retaliation” and seeks monetary damages.

Microsoft Doesn’t Want Employees Sharing Sensitive Data With ChatGPT
Sat, 03 Jun 2023

Microsoft may be going all-in on OpenAI tech and ChatGPT, but that doesn’t mean the company wants sensitive information shared with it.

Microsoft is rolling out ChatGPT across multiple products and has no objection to its own employees using the tech. However, the company wants to make sure no sensitive information is shared with the AI.

“Please don’t send sensitive data to an OpenAI endpoint, as they may use it for training future models,” a senior engineer wrote in an internal post that was reviewed by Business Insider.

The memo demonstrates one of the biggest challenges moving forward with large language model AIs, namely controlling what information it has access to, and how that information will be used if it is shared.

ChatGPT is a conversational AI that learns from its interactions and what people type into it. As such, it’s not surprising that Microsoft wants to make sure no sensitive information is shared with it, since the AI could then end up using that information in its responses to users.

“Human beings sign NDAs and consequently have incentives to be careful in how they share information. But large language models such as ChatGPT do not have the ability to reason about such issues, at least by default,” Vincent Conitzer, Carnegie Mellon University computer science professor and director of its AI lab, told Insider.

Microsoft’s caution is one other companies would do well to imitate.

Looker Comes to Google Cloud Console
Tue, 16 May 2023

Google says its Looker insight tool is now accessible via the Google Cloud console, streamlining organizations’ access to business intelligence.

Google is calling the new service “Looker (Google Cloud core)” — no one ever accused engineers of being good at naming things — and it comes in Standard and Enterprise editions, as well as a dedicated Embed edition.

Integrating Looker into Google Cloud console brings a number of benefits, according to the company:

Looker (Google Cloud core) offers organizations a fresh, consistent real-time view of their business data, and extends the benefits that a commissioned study by Forrester Consulting on behalf of Google – “The Total Economic Impact™ Of Google BigQuery and Looker” (April 2023) says leads to an ROI of greater than 200%, while bringing the offering closer to Google Cloud’s array of leading products. This new offering builds upon the semantic modeling and data exploration capabilities Looker has been known for over the last decade and adds expanded security options, Google Cloud integrations, and instance management features.

Being part of Google Cloud console also means that businesses can test drive Google’s business intelligence solutions at no cost for 30 days.

Microsoft Acquires Fungible to Improve Its Data Centers
Mon, 15 May 2023

Microsoft has announced its acquisition of Fungible, a company that produces data processing units (DPUs) used in data centers.

Microsoft Azure is the second-largest cloud computing platform behind AWS. Microsoft clearly wants to improve its data center offerings, and sees Fungible as a way to achieve that.

“Fungible’s technologies help enable high-performance, scalable, disaggregated, scaled-out datacenter infrastructure with reliability and security,” writes Girish Bablani, Corporate Vice President, Azure Core.

“The Fungible team will join Microsoft’s datacenter infrastructure engineering teams and will focus on delivering multiple DPU solutions, network innovation and hardware systems advancements.”

Microsoft sees Fungible as a long-term investment that will help it differentiate its offerings.

“Today’s announcement further signals Microsoft’s commitment to long-term differentiated investments in our datacenter infrastructure, which enhances our broad range of technologies and offerings including offloading, improving latency, increasing datacenter server density, optimizing energy efficiency and reducing costs,” Bablani adds.

Financial terms of the acquisition were not revealed.

FTC’s ‘Blanket Prohibition’ Would Prohibit Facebook From Profiting Off of Youth Data https://www.webpronews.com/ftcs-blanket-prohibition-would-prohibit-facebook-from-profiting-off-of-youth-data/ Sun, 14 May 2023 01:02:43 +0000 https://www.webpronews.com/?p=523508 The Federal Trade Commission is proposing new protections that would prohibit Facebook from profiting off of youth data.

Facebook and its parent company Meta have come under growing criticism for the impact they have on young users. Unfortunately, young people represent an important market, meaning that Facebook and other social media companies are strongly incentivized to profit from their activity.

The FTC wants to put a stop to it, and is proposing new changes to the 2020 privacy order between the agency and Facebook.

“Facebook has repeatedly violated its privacy promises,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”

The agency’s new proposals would prohibit Facebook from monetizing youth data in any way:

As part of the proposed changes, Meta, which changed its name from Facebook in October 2021, would be prohibited from profiting from data it collects, including through its virtual reality products, from users under the age of 18. It would also be subject to other expanded limitations, including in its use of facial recognition technology, and required to provide additional protections for users.

The FTC outlined five changes to the 2020 order, changes that would impact all of Facebook’s services:

  • Blanket prohibition against monetizing data of children and teens under 18: Meta and all its related entities would be restricted in how they use the data they collect from children and teens. The company could only collect and use such data to provide the services or for security purposes, and would be prohibited from monetizing this data or otherwise using it for commercial gain even after those users turn 18.
  • Pause on the launch of new products, services: The company would be prohibited from releasing new or modified products, services, or features without written confirmation from the assessor that its privacy program is in full compliance with the order’s requirements and presents no material gaps or weaknesses.
  • Extension of compliance to merged companies: Meta would be required to ensure compliance with the FTC order for any companies it acquires or merges with, and to honor those companies’ prior privacy commitments.
  • Limits on future uses of facial recognition technology: Meta would be required to disclose and obtain users’ affirmative consent for any future uses of facial recognition technology. The change would expand the limits on the use of facial recognition technology included in the 2020 order.
  • Strengthening existing requirements: Some privacy program provisions in the 2020 order would be strengthened, such as those related to privacy review, third-party monitoring, data inventory and access controls, and employee training. Meta’s reporting obligations also would be expanded to include its own violations of its commitments.

Meta has 30 days to respond to the FTC’s proposals.

Windows 11 Sends Massive Amounts of Data to Ad Companies https://www.webpronews.com/windows-11-sends-massive-amounts-of-data-to-ad-companies-2/ Fri, 12 May 2023 12:00:00 +0000 https://www.webpronews.com/?p=521702 The PC Security Channel (TPSC) analyzed Windows 11 and found it sends massive amounts of user data to Microsoft, as well as third-party ad companies.

TPSC is a YouTube channel dedicated to cybersecurity and privacy. The channel took a brand-new, never-before-used laptop and used Wireshark to monitor the computer’s network traffic from the moment it was first booted.

As expected, the computer immediately connected to a number of Microsoft services, including Bing, MSN, and the Windows Update service. While it’s not surprising that a Windows machine would connect to Microsoft, the Bing traffic occurred without the web browser ever being opened or used.

Even more surprising, Windows 11 also connected to McAfee, Steam, and Comscore’s ScorecardResearch.com, to name just a few. The last is particularly alarming, as Comscore is an ad-tech company. In fact, when TPSC first tried visiting ScorecardResearch.com to see what it was, the channel’s browser ad blocker refused to load the page because the domain is a known ad and tracking domain.
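Readers who want to replicate a lightweight version of TPSC’s test don’t need the full Wireshark UI; watching DNS lookups alone reveals which domains a machine phones home to. Here is a minimal sketch using Python’s scapy library (it must run with root/administrator privileges, and the interface name is an assumption for your system):

```python
from scapy.all import DNSQR, sniff  # pip install scapy; run as root

seen = set()

def log_dns_query(pkt):
    """Print each unique domain the machine looks up."""
    if pkt.haslayer(DNSQR):
        domain = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
        if domain not in seen:
            seen.add(domain)
            print(domain)

# Capture DNS traffic; swap "eth0" for your own interface name.
sniff(iface="eth0", filter="udp port 53", prn=log_dns_query, store=False)
```

Running something like this from first boot would surface the same kinds of unexpected connections TPSC observed.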

To make matters worse, Microsoft connects and sends data to these servers without expressly asking the user’s permission. Instead, the company treats a vague clause in the Microsoft License Terms as permission.

Privacy; Consent to Use of Data. Your privacy is important to us. Some of the software features send or receive information when using those features. Many of these features can be switched off in the user interface, or you can choose not to use them. By accepting this agreement and using the software you agree that Microsoft may collect, use, and disclose the information as described in the Microsoft Privacy Statement (aka.ms/privacy), and as may be described in the user interface associated with the software features.

Tom’s Hardware reached out to Microsoft and was given the following statement:

“As with any modern operating system, users can expect to see data flowing to help them remain secure, up to date, and keep the system working as anticipated,” a Microsoft spokesperson said. “We are committed to transparency and regularly publish information about the data we collect to empower customers to be more informed about their privacy.”

A legitimate case can be made for Windows 11 connecting to Microsoft services, but there is absolutely no valid justification for connecting to and sending telemetry to an ad-tech company.

Interestingly, TPSC ran the same test with Windows XP and found that it only connected to Microsoft update servers, greatly undermining Microsoft’s claim that Windows 11’s connections to third parties were necessary to “remain secure, up to date, and keep the system working as anticipated.”

As we have stated at WPN many times, there is NO EXCUSE for a company that charges handsomely for a product to then turn around and try to monetize its customers’ data, let alone try to do so without express and explicit permission. And no, a couple of sentences buried in a long, legalese licensing document that few people will ever read does not count as express and explicit permission.

Microsoft should be ashamed of itself for this behavior, and one can only hope this revelation will put the company in the crosshairs of the EU’s GDPR enforcers.

In the meantime, TPSC’s question, “Has Windows become spyware?” is one that deserves an answer.

Nate Silver and FiveThirtyEight Staff Leaving Disney https://www.webpronews.com/nate-silver-and-fivethirtyeight-staff-leaving-disney/ Wed, 26 Apr 2023 00:22:06 +0000 https://www.webpronews.com/?p=523220 Nate Silver, founder of FiveThirtyEight, has said he’s likely leaving Disney amid a round of layoffs impacting his staff.

Nate Silver gained fame for creating FiveThirtyEight, a political analysis website that attained a degree of success predicting political outcomes. The site eventually branched out into sports, using its analytics to predict winners and losers, and was acquired by ESPN in 2013. It was later transferred to ABC.

According to a tweet by Silver, he doesn’t expect to remain at the company once his contract expires this summer.

Layoffs hit FiveThirtyEight particularly hard today, with deputy managing editor Chadwick Matlin, business ops manager Vanessa Diaz, senior audience editor Meena Ganesan, senior science reporter Maggie Koerth, senior designer Emily Scherer, and sports editor Neil Paine all tweeting news of their departures.

Android Apps to Let Users Delete Their Accounts and Data https://www.webpronews.com/android-apps-to-let-users-delete-their-accounts-and-data/ Sun, 09 Apr 2023 21:08:10 +0000 https://www.webpronews.com/?p=522833 Google is implementing a major change in Android, requiring that developers give users a way delete their accounts and data.

While certainly convenient, the plethora of mobile apps many people use can be a major security risk. The more apps a person uses, the more their personal data is scattered across various platforms and services, opening additional attack vectors if those services are compromised.

Google is working to limit the threat with a new data deletion policy:

Google Play has launched a number of recent initiatives to help developers build consumer trust by showcasing their apps’ privacy and security practices in a way that is simple and easy to understand. Today, we’re building on this work with a new data deletion policy that aims to empower users with greater clarity and control over their in-app data.

For apps that enable app account creation, developers will soon need to provide an option to initiate account and data deletion from within the app and online. This web requirement, which you will link in your Data safety form, is especially important so that a user can request account and data deletion without having to reinstall an app.

Developers have until December 7 to begin answering questions in their app’s Data Safety section; those that need more time can apply for an extension, giving them until May 31, 2024 to comply.
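In practice, the in-app flow and the web flow can share a single backend endpoint, which is roughly what the policy’s “within the app and online” wording implies. The following is a minimal, hypothetical sketch of such an endpoint using Flask; the route, header-based auth check, and in-memory store are all illustrative assumptions, not anything Google prescribes.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory store standing in for a real user database.
USERS = {"user-123": {"email": "jane@example.com"}}

@app.route("/account", methods=["DELETE"])
def delete_account():
    """Delete the caller's account and data; serves both app and web flows."""
    # The header check is a placeholder; a real service would verify a
    # session or OAuth token before destroying anything.
    user_id = request.headers.get("X-User-Id")
    if user_id not in USERS:
        return jsonify(error="unknown or unauthenticated user"), 401
    del USERS[user_id]  # purge the account record and associated data
    return jsonify(status="account and data deleted"), 200

if __name__ == "__main__":
    app.run()
```

Serving one deletion path to both the app and a linked web page keeps the behavior consistent, so a user who has already uninstalled the app can still wipe their data.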

Google’s decision is good news for users and will hopefully give them more control over their data.
