
Posted in Top Stories

Semiconductor Test – Toward a Data-Driven Future

By Keith Schaub, Vice President, Marketing and Business Development, Applied Research and Technology Group, Advantest America

Integrating new and emerging technologies into Advantest’s offerings is vital to staying ahead of future requirements and continually expanding the value we provide to our customers. Industry 4.0 is changing the way we live and work, as well as how we interact with each other and our environment.

This article will look at some key trends driving this new Industry 4.0 era – how they evolved and where they’re headed. Then, we’ll highlight some use cases that could become part of semiconductor test as it drives towards a data-driven future. 

The past

To understand where we’re headed, we need to understand where we’ve been. In the past, we tested to collect data (and we still do today). We’ve accomplished tremendous things – optimizing test-cell automation and gathering and analyzing yield learnings, process-drift data, and statistical information, to name a few. But we ran into limitations.

For instance, we lacked the tools necessary to make full use of the data. Data was often siloed, or disconnected. Moreover, it was rarely in a useful format, so data from one insertion could not be used in another insertion – and data that cannot serve multiple insertions loses much of its value. Sometimes, we were simply missing high-value data, or collecting the wrong type of data altogether.

The future

Moving forward, we believe the data we collect will drive the way we test. Siloed data systems will start to be connected, so that we can move data quickly and seamlessly from one insertion to another – feeding the data both forward and backward – as we advance into Industry 4.0. This will allow us to tie together all the datasets across the test chain: wafer, package, and system-level test. All of this data will be very large (terabytes and petabytes), and when we apply artificial intelligence (AI) techniques to it, we’ll gain new insights and new intelligence that will help guide what and where we should be testing.

We’ll ask new questions we hadn’t thought to ask before, as well as explore long-standing questions. For example, one dilemma we’ve faced for years is how best to optimize the entire test flow, from inception to the end of the test cycle. Should the test be performed earlier? Later? Is the test valuable, or should it come out? Do we need more tests? How much testing do we need to do to achieve the quality metric we’re shooting for? In the Industry 4.0 era, we’ll start seeing answers to these questions, and those answers will resonate throughout the test world.

Data…and more data

Today, thanks to the convergence of data lakes and streams, we have more data available to us than ever before. In the last two years alone, we’ve generated more data than in all prior human history, and this trend will only accelerate. According to some estimates, in the next few years we will be generating 44 exabytes per day – about 5 billion DVDs’ worth of data every day. Stacked up, those DVDs would stand taller than 36,700 Washington Monuments, and the discs generated over about a week would circle the globe (see Figure 1).

Figure 1. The volume of data we generate will soon reach 44 exabytes, or 5 billion DVDs, per day. Since this amount of data could circle the earth in about seven days, an “earth byte” could equate to a week’s worth of data.

These kinds of numbers are so massive that the term “Big Data” doesn’t really suffice. We need a global image to help visualize just how much data we will be generating on a daily basis. Based on these numbers, we could begin using the term “earth byte” to describe how much data is generated per week. Regardless of what we call it, it’s an unprecedented amount of data, and it is the fuel behind Industry 4.0.
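The DVD arithmetic behind these figures can be sanity-checked in a few lines. The disc capacity is an assumption on our part – the article doesn’t specify one – but using dual-layer 8.5-GB DVDs, 44 exabytes per day works out to roughly five billion discs:

```python
EXABYTE = 10**18   # bytes
GIGABYTE = 10**9   # bytes

def dvds_per_day(exabytes_per_day: float, dvd_capacity_gb: float = 8.5) -> float:
    """How many DVDs would hold one day's worth of generated data."""
    return exabytes_per_day * EXABYTE / (dvd_capacity_gb * GIGABYTE)

# 44 EB/day on dual-layer discs lands close to the 5-billion-DVD figure
print(f"{dvds_per_day(44) / 1e9:.1f} billion DVDs per day")
```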

Industry 4.0 pillars

Five key pillars are driving and sustaining the Industry 4.0 era (Figure 2):

  • Big Data – as noted above, we are generating an unprecedented and near-infinite amount of data; half of it comes from our cell phones, and much of the rest from the IoT
  • IoT – sensor-rich and fully connected, the IoT is generating a wealth of data related to monitoring our environment – temperature, humidity, location, etc.
  • 5G – the 5G global wireless infrastructure will enable near-zero-latency access to all of the data being generated
  • Cloud computing – allows us to easily and efficiently store and access all our earth bytes of data
  • AI – we need AI techniques (machine learning, deep learning) to analyze these large datasets in real time as they are sent to the cloud, in order to produce high-value, actionable insights

Figure 2. The five key pillars of Industry 4.0 are all interconnected and interdependent.

Because they are all reinforcing and accelerating each other, these Industry 4.0 trends are driving entire industries and new business models, creating an environment and level of activity that’s unprecedented.

Hypothetical use cases

Now that we’ve looked at where the test industry has been and what is driving where we’re headed, let’s examine some theoretical use cases (grounded in reality) that provide a visionary snapshot of ways we may be able to leverage the Industry 4.0 era to heighten and improve the test function and customers’ results. Figure 3 summarizes these five use cases.


Figure 3. Industry 4.0 will enable advancements in many areas of the test business.

1) Understanding customers better – across the supply chain

This use case encompasses various customer-related aspects that Industry 4.0 will enable us to understand and tie together to create new solutions. These include:

    • Customers’ march toward and beyond 5nm and how wafer, package, and system-level testing will work together for them
    • The entire supply chain’s cost challenges, which will help us optimize products and services across the value chain
    • How automotive quality requirements are driving into other business segments – as autonomous vehicles will be connected to everything across 5G, the quality of the connected network and its components will be forced to improve
    • 5G’s advanced technologies, including phased arrays, over-the-air testing, and millimeter-wave, all of which are already mature in the aerospace and military sectors – we will need to leverage those technologies, bring their costs down appropriately, and support them for high-volume testing

2) Decision making – yield prediction
The ability to predict yields will change everything. If you know, based on historical process data, that you’ll experience a yield drop within the next one to two months, you can start additional wafers to offset the drop. This straightforward fix would cause very little disruption to the supply chain.

Once you can predict the problem, however, the next obvious question is: what’s causing it? Why not simply fix it before it happens? That calls for prescriptive analytics, which will follow predictive analytics. Say you have developed a new generation of a product. You’ve collected yield data at all test insertions for previous generations of the product, which share DNA with the new incarnation. Combining past data with present data creates a model that enables highly accurate predictions about how a wafer will perform as it moves through the supply chain.
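As a rough illustration of the predictive step – not Advantest’s actual method, and with entirely synthetic numbers – a model could regress final yield on in-line process-drift measurements gathered at earlier insertions, then score a new lot before it reaches final test:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic history: each row is one lot's in-line drift measurements
# (e.g. critical-dimension drift, film-thickness drift); target is final yield %.
n_lots = 200
drift = rng.normal(0.0, 1.0, size=(n_lots, 2))
yield_pct = (95.0 - 3.0 * drift[:, 0] - 1.5 * drift[:, 1]
             + rng.normal(0.0, 0.5, n_lots))

# Least-squares fit: predicted_yield = drift @ w[:2] + w[2]
X = np.column_stack([drift, np.ones(n_lots)])
w, *_ = np.linalg.lstsq(X, yield_pct, rcond=None)

# Score a new lot whose drift is already measurable upstream
new_lot = np.array([0.8, -0.2, 1.0])  # feature 0 is drifting upward
print(f"predicted yield: {float(new_lot @ w):.1f}%")
```

A real deployment would of course use far richer features fed forward from wafer, package, and system-level insertions, but the shape of the idea is the same.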

3) Creating new customer value – predictive maintenance
This use case is the most likely to come to fruition in the near term. Maintenance contracts require carrying inventory and spare parts and managing myriad logistics – they represent a huge cost. Soon, by combining tester fleet data with customer data and implementing machine learning, we’ll be able to replace modules before they fail – dramatically improving tester availability, reducing planned maintenance, and decreasing losses due to service interruptions.

Predictive maintenance is a proven practice already used in other industries, such as oil and gas. IoT sensor arrays are applied to the huge pipes and pumps controlling the flow of chemicals, measuring stress, flow rates, and other parameters. The data from these sensors predicts when a pump is going to wear out or a pipe needs to be replaced, before the failure occurs. We can leverage, redefine, and redeploy this approach for our use case. Soon, a field service engineer could show up with a replacement module before you even know you need it.
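A minimal sketch of that idea, using made-up pump telemetry: fit the trend in a wear indicator and extrapolate to a known failure threshold, so replacement can be scheduled ahead of the failure. The threshold and sensor values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up telemetry: a pump's vibration level creeping upward over 60 days
days = np.arange(60)
vibration = 1.0 + 0.03 * days + rng.normal(0.0, 0.05, 60)
FAILURE_THRESHOLD = 4.0  # vibration level at which this pump class fails

# Fit the wear trend and extrapolate to the threshold crossing
slope, intercept = np.polyfit(days, vibration, 1)
crossing_day = (FAILURE_THRESHOLD - intercept) / slope
print(f"schedule replacement in ~{crossing_day - days[-1]:.0f} days")
```

Production systems replace the straight-line fit with learned degradation models, but the replace-before-failure logic is identical.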

4) Monetization – using data in new ways to drive our business
Data is an asset, and we’ll start to derive new business from sharing access to, or leasing use of, our data assets. One example might be a tester digital twin that resides in the cloud. Imagine that customers’ chip model data could be fed into this digital twin as a kind of virtual insertion, and the outputs would be parameters such as performance and yield. Customer benefits would include optimized programs, recommended tests, and predicted test coverage at each virtual insertion. This would enable them to optimize the entire flow depending on the product life cycle – perhaps the test order could be changed, or a test added, in order to improve quality. Because Advantest owns all the data that comes from our testers, we could lease or sell chipmakers access to that data, creating a significant business opportunity.

5) Automating and improving business operations – driving efficiencies
The test engineering community struggles to find ways to improve process efficiencies. One way to do this is with intelligent assistants. Still in its infancy, this category of AI can best be described as a trained assistant that guides you helpfully as you perform a task.

For example, say we are validating a new 5G RF product on our Wave Scale RF card on the V93000 tester. All the pieces are being brought together – load board, tester, socket, chip, test program – and if there are any problems, the whole thing won’t work, or you’ll get only partial functionality. An intelligent assistant, or ‘bot,’ trained in the necessary skillset can dynamically monitor the inputs, the outputs, and engineers’ interactions, and provide real-time suggestions or recommendations on how to resolve the issues. At first it won’t be smart, but it will learn quickly from the volume of data and improve its recommendations over time.

As you can see, AI’s potential is vast. It will touch all aspects of our lives, but at its core, AI is really just another tool. Just as the computer age revolutionized our lives in the ’80s and ’90s, AI and Big Data will disrupt every industry we can think of – and some we haven’t yet imagined. Those slow to adopt AI as a tool risk being left behind, while those that embrace AI and learn to fully utilize it for their industries will be the future leaders and visionaries of Industry 5.0, whatever that may be.

Did you enjoy this article? Subscribe to GOSEMI AND BEYOND

Read More

Posted in Featured

I Think, Therefore I Am… Machine?

By Judy Davies, Vice President, Global Marketing Communications, Advantest

The ability to think has been a central, defining aspect of humanity since our beginning. Today, technologists are using artificial intelligence to instill that capability into machines. Through statistical models and algorithms, machine learning enables computers to perform specific tasks without receiving explicit instructions from a human. This means that the computer reaches conclusions by accessing available data, identifying patterns and using logical deduction. This does NOT mean AI systems can generate original ideas (at least, not yet). Rather, their intellect stems from their near-instant ability to crunch large volumes of data and then employ their massive memory capacity to compare and search for linkages that yield logical answers.

An emerging area of machine learning is generative adversarial networks (GANs): deep neural network architectures comprising two nets, in which one is pitted against the other in an unsupervised learning environment. For example, one computer might generate a realistic image, and another is then tasked with determining whether or not the image is authentic. By having these two neural nets engage in game-playing to repeatedly fabricate and then detect realistic likenesses, GANs can be used to produce images that a human observer would assess as genuine.
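The adversarial loop described above can be sketched on a toy problem. Here a one-parameter “generator” learns the mean of a 1-D real distribution while a logistic “discriminator” tries to tell real samples from fakes – a deliberately minimal stand-in for the deep networks real GANs use, with all numbers invented:

```python
import math
import random

random.seed(0)

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# Real data ~ N(4.0, 0.5). Generator: x = theta + z, z ~ N(0, 1) (it learns
# only a mean). Discriminator: D(x) = sigmoid(a*x + b), a linear classifier.
theta, a, b = 0.0, 0.0, 0.0
lr, batch = 0.05, 32

for step in range(3000):
    reals = [random.gauss(4.0, 0.5) for _ in range(batch)]
    fakes = [theta + random.gauss(0.0, 1.0) for _ in range(batch)]
    # Discriminator ascends log D(real) + log(1 - D(fake))
    ga = (sum((1 - sigmoid(a * r + b)) * r for r in reals)
          - sum(sigmoid(a * f + b) * f for f in fakes)) / batch
    gb = (sum(1 - sigmoid(a * r + b) for r in reals)
          - sum(sigmoid(a * f + b) for f in fakes)) / batch
    a, b = a + lr * ga, b + lr * gb
    # Generator ascends log D(fake); d/d_theta log D(theta + z) = (1 - D) * a
    fakes = [theta + random.gauss(0.0, 1.0) for _ in range(batch)]
    theta += lr * sum((1 - sigmoid(a * f + b)) * a for f in fakes) / batch

print(f"generator mean: {theta:.2f} (real mean: 4.0)")
```

As the two players push against each other, the generator’s output drifts toward the real distribution – the same dynamic that, at vastly larger scale, lets GANs fabricate convincing images.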

It should come as no surprise that training GANs is challenging. To use an analogy easily understood by the human mind: It’s easier to recognize an M.C. Escher drawing than it is to replicate one. Nevertheless, GANs hold extraordinary potential. Working from motion patterns captured on video, they can create 3D models of a wide range of objects, from industrial product designs to online avatars. They can also be used to digitally age a person’s image, showing how he or she may look a decade or more in the future – which may be useful in helping to identify teenagers or adults who went missing as children. Going a step further, GANs can sort through many terabytes of images culled from security monitors and traffic cameras to perform facial recognition. This can help to identify and track the whereabouts of missing kids or wandering Alzheimer’s patients – not to mention wanted criminals.

As with most technology, there is a cautionary aspect to GANs. For example, they could potentially be used to generate artificial images for nefarious purposes, such as creating fake photographs or video clips that unsavory types might use to make innocent people appear guilty for political or financial gain. They may also be used to circumvent the CAPTCHA security feature of wavy letters and numbers that many websites use to deter bots from accessing the sites in the guise of human viewers. How to build in safeguards that prevent these types of illicit deployments of GANs is an important consideration.

GANs can be applied to synthesize or fine-tune everything from voice-activated smart electronics to robotic medical procedures. As the technology is further developed and applied, machine learning and GANs are becoming reality. Self-improving AI increasingly shapes the authenticity of what we perceive and think – a literal (human) brain-teaser.



Posted in Upcoming Events

Advantest Showcases 5G Readiness at SEMICON West 2019

Advantest sponsored, exhibited, and presented at SEMICON West from July 9-11, 2019 at the Moscone Center in San Francisco, California. Advantest’s booth was centrally located in the South Hall alongside other large semiconductor companies and featured the new theme, “5G: Made Real by Our Customers, Made Possible by Advantest.”

In Booth 939, Advantest showcased several products including the new – and “Best of West” award finalist – V93000 Wave Scale Millimeter solution, the industry’s first integrated and modular multi-site millimeter-wave (mmWave) ATE test solution to cost-effectively test 5G-NR mmWave devices up to 70 GHz. Other displays included information on the new Advantest Test Solutions (ATS) for SoC system-level test; the MPT3000 platform for solid-state drives (SSDs); the T5500-series and T5800-series memory test platforms; and an array of software tools and services to improve overall productivity and test quality. The booth also contained a unique automotive display illustrating how the T2000 series of testers is improving the performance and reliability of broader types of automotive devices, from sensors, processors and powertrains to communication systems.

In addition to having a presence on the show floor, Advantest was a sponsor of the Test Vision Symposium and presented during two of the sessions. Kotaro Hasegawa, system planning senior director, presented a paper titled, “New SiP Packaging Trends and Testing Challenges” during the Packaging and Test session and then Adrian Kwan, senior business development manager, took part in an interactive panel discussion about how 5G has changed the way devices are tested titled, “Addressing Challenges of 5G Test Today and in the Future.”

On the evening of Wednesday, July 10, Advantest customers and industry members gathered for the annual Advantest Customer Hospitality Event hosted at first-time venue Local Edition, a subterranean cocktail bar in the depths of the historic Hearst building. Over 200 attendees networked to the backdrop of live music by classically trained violinist Gabi Holzwarth.

Throughout the conference, Advantest sponsored and participated in the award-winning SEMI High Tech U program, which gives high school students the opportunity to explore the semiconductor industry and develop skills in science, technology, engineering, and mathematics (STEM). Advantest employees led modules on engineering design challenges, critical thinking, and social media; conducted mock interviews; and fielded industry questions during booth tours.



Posted in Featured Products

New End-to-End Test Solutions for 5G, Automotive and IoT

Advantest’s new MPT3000ARC is the industry’s first test platform to combine thermal-control capability with high throughput, enabling extreme thermal testing of solid-state drives (SSDs). By adding this new system to the MPT3000 product family, which is already in wide use by SSD manufacturers, Advantest supports SSD testing from design to manufacturing, providing the fastest, lowest-risk path to market for next-generation devices, including PCIe Gen 4. In addition to meeting automotive thermal test standards, the new tester’s automation-ready thermal chamber enables SSD manufacturers to quickly ramp temperatures, which optimizes Reliability Demonstration Test (RDT) and results in faster time to market. With the addition of the MPT3000ARC, the MPT3000 series enables rapid changeover to provide a single-system test solution for a wide variety of SSD products, from 40-mm M.2 memories to larger EDSFF devices.

Designed to enable mission-critical testing across a broad range of SSD form factors and protocols, the MPT3000ARC offers versatility that is a key advantage in the continually shifting and developing SSD market. The single-system solution allows SSD manufacturers to easily evolve from testing PCIe Gen 3 devices to Gen 4 devices by simply changing a board and downloading firmware. This new tester provides the fastest path to bring PCIe Gen 4 SSDs to market while minimizing risk, reducing test development time, and accelerating new product validation, debugging, and production tests.

The continuing growth projected for the SSD market requires device manufacturers to find a highly flexible test solution capable of supporting their expanding product portfolios at a low cost of test. Advantest’s new MPT3000ARC tester is designed with the full spectrum of capabilities to handle all SSDs – not only the most advanced PCIe Gen 4 memories, but also the highest-performing enterprise drives and the most cost-effective client devices used throughout mass-market connected devices, from smart cars to wearable electronics.

With an increasing number of SSDs being used in rugged thermal environments, these memory devices must be proven to withstand harsh conditions. The MPT3000ARC features an innovative thermal chamber that allows it to operate over a broad range of temperatures, satisfying automotive and industrial thermal-testing standards. This makes the tester ideally suited for reliability demonstration testing across a rapidly multiplying array of applications.

The MPT3000ARC applies the same proven architecture, software and performance already in wide use by SSD manufacturers worldwide. Its production-compatible ergonomics and automation-friendly chamber access make it suitable for high-volume SSD testing.

By using changeable and customizable interface boards, this tester has the versatility to handle virtually all SSD form factors, from 40-mm M.2 memories to larger EDSFF devices. The system’s design enables quick and easy switching of interface boards, enabling rapid changeover to support a wide variety of SSD products on a single system.

As the newest member of Advantest’s MPT3000 product family, the MPT3000ARC is fully integrated. Its efficiency and performance are optimized by leveraging the same tester-per-DUT architecture, site modules, power supplies and hardware acceleration as all other systems in the MPT3000 series.

View video to learn more.



Posted in Q&A

Q&A Interview with Dieter Ohnesorge – 5G mmWave Challenges and Solutions

By GO SEMI & Beyond staff

Millimeter-wave (mmWave) – the band of spectrum between 24 GHz and 100 GHz – is the key topic when it comes to frequency ranges that allow more bandwidth to be allocated. Because it enables allocation of more bandwidth for high-speed wireless communications, mmWave is increasingly viewed as one key to making 5G connectivity a reality. In this issue, Dieter Ohnesorge, product manager, RF solutions for Advantest, discusses the market opportunity and test challenges associated with 5G mmWave, as well as Advantest’s solution for addressing them.

Q. We’ve been hearing about the promise of 5G for a long time. What demand drivers are edging it closer to fruition?

A. If you look at the global ecosystem [Figure 1], there is massive potential for 5G in many vertical markets. For example, 5G will be an essential aspect of smart manufacturing (SM). SM processes provide greater access to real-time data across entire supply chains, allowing manufacturers and suppliers to manage both physical and human resources more efficiently. This will result in less waste and system downtime and will make more technology-based manufacturing jobs available.

Remote access to health services is another key benefit of 5G. First, it would mean less driving, which is much better for the environment as well as for patients, doctors, and staff. Second, if you’ve already had a screening and the doctor has access to it, why not communicate remotely, saving time on both sides? With 5G, you have the benefit of high bandwidth and low latency, which is important for many applications. Autonomous driving, consumer multimedia applications, and remote banking are just a few more of the many areas that will benefit from highly reliable connections, as well as high bandwidth and/or low latency.

Figure 1. A global ecosystem of vertical deployments stands ready to benefit from 5G.

Q. What has prevented 5G from becoming fully implemented?

A. Primarily, the infrastructure requirements. A specification of this scale cannot be implemented on a local basis alone – it takes a concerted, global effort. The worldwide effort to achieve 5G standardization is a huge step forward. In the U.S., discussions about mmWave technology are currently under way, and at the end of this year or early next, the discussion will expand toward 5G in the sub-6 GHz band.

In 2015, Verizon took it upon itself to define a proprietary version of 5G as the next step forward from the 4G LTE standard. At the end of 2018, the 5G NR (New Radio) industry standard that developed from the Verizon effort was released, and all new deployments will follow this spec. In the U.S., the initial frequency bands are 28 GHz, with two 425-MHz carrier channels, and 24 GHz, with seven 100-MHz channels. Additional frequency bands at 37, 39 and 47 GHz will be auctioned by the FCC from December 2019 onward. Other mmWave activities can be seen all over the world, although at varying paces.

Q. Where does mmWave come into play?

A. Because the portion of the spectrum that mmWave covers is largely unused, mmWave technology can greatly increase the amount of bandwidth available, making it easier to implement 5G networks. Lower frequencies are already occupied by existing 4G LTE networks, which typically operate between 800 and 3,000 MHz. Another advantage is that mmWave can transfer data faster due to the wider bandwidth per channel, although over a shorter transfer distance – up to around 250 meters, or just over 800 feet. This means that it could conceivably work as a replacement for fiber or copper wire into homes and businesses, and this “last mile” capability would broaden the reach of 5G to cover both small and very large areas.
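The per-channel bandwidth advantage can be made concrete with the Shannon capacity bound, C = B·log2(1 + SNR). The numbers below are illustrative assumptions – a typical 20-MHz LTE carrier against one of the 425-MHz 28-GHz channels mentioned earlier, with the same 20 dB SNR for both:

```python
from math import log2

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon upper bound on channel data rate: C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * log2(1 + snr_linear) / 1e9

# Identical SNR assumed; only the channel bandwidth changes.
lte_gbps = shannon_capacity_gbps(20e6, 20)      # typical 20-MHz 4G LTE carrier
mmwave_gbps = shannon_capacity_gbps(425e6, 20)  # one 425-MHz 28-GHz 5G-NR channel
print(f"LTE: {lte_gbps:.2f} Gb/s  mmWave: {mmwave_gbps:.2f} Gb/s")
```

At equal SNR the capacity ratio is simply the bandwidth ratio (425/20 ≈ 21x), which is why the wider mmWave channels translate directly into faster transfer.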

Q. What are the challenges around mmWave test that spurred Advantest to develop a solution? How does the solution address them?

A. Advantest’s Wave Scale RF card for the V93000 tester platform has seen great success. Its operational range is 10 MHz to 6 GHz, so we needed a solution that could address the frequency and power requirements associated with higher-bandwidth devices.

Frequency is one of the key parameters associated with mmWave, and with that comes power-level measurement, EVM [error vector magnitude], ACLR [adjacent-channel leakage ratio], and other aspects that all need to be addressed in the testing process to ensure they meet specifications at the wider bandwidths required by 5G-NR.

Another requirement is the number of ports – with 5G mmWave’s beamforming capability, testing could easily involve as many as 32 to 64 ports. At the same time, because mmWave operates at 5x to 7x higher frequencies, cost goes up as well. That has been another challenge: holding down the cost of test with a large number of sites being tested in parallel.

The V93000 Wave Scale Millimeter test solution, which we introduced in May 2019, extends the capabilities of Wave Scale RF. It is designed for multi-band mmWave frequencies, offering high multi-site parallelism and versatility. It has two operational ranges: 24 GHz to 44 GHz for 5G mmWave, and 57 GHz to 70 GHz, which extends the product’s capabilities for the wireless Gigabit, or WiGig, era. Figure 2 shows the range of frequencies that Wave Scale was developed to cover.

Figure 2. Wave Scale RF provides a scalable platform for connectivity device test, from standard RF to millimeter-wave.

In addition, new modules can be added as new frequency bands are rolled out worldwide. The card cage holds up to eight mmWave instruments, making the solution versatile, cost-effective, and able to perform on par with high-end bench instruments. Because it has wideband testing functionality, Wave Scale can handle full-rate modulation and demodulation for ultra-wideband (UWB), 5G-NR mmWave up to 1 GHz, and WiGig up to 2 GHz, supporting wafer probing as well as connectorized and over-the-air testing of antenna-in-package (AiP) devices.

Figure 3 illustrates 5G device measurements that can be achieved using Wave Scale Millimeter: power out/flatness test results. The solution’s massive parallelism allows these tests to be performed quickly and at significant cost savings.

Figure 3. This graph overlays a customer’s 8-channel transceiver power-out test results, performed over 800 MHz at 28 GHz. Wave Scale allows channel flatness to be executed in a single operating sequence, one channel after the other.

Q. When will this solution be widely needed?

A. The industry is still learning how to test these devices. We can help customers get started now, thanks to the modularity of the solution. They can start below 6 GHz, and when they need the higher frequencies, we can add the mmWave capability.

The bottom line is that Advantest’s platform approach is ideal for this scenario – because it is scalable and modular, we can continue to add to the product’s functionality to make it even more comprehensive. By being ahead of the curve, we will have the right solution ready when our customers need to adapt to new requirements.

