003 Java Multithreading: Synchronized vs. Asynchronous Methods

發(fā)布時(shí)間:2020-04-03 09:51:20 來(lái)源:網(wǎng)絡(luò) 閱讀:389 作者:zjy1002261870 欄目:編程語(yǔ)言

package com.skcc.mthread;

public class MyThread002 {

    /*
     * synchronized void work(): synchronized method; only one thread at a time
     *                           can execute it on the same instance.
     * void eat(): asynchronous (unsynchronized) method; threads can call it
     *             without waiting for the instance lock.
     */
    public synchronized void work() {
        System.out.println(Thread.currentThread().getName() + " synchronized work method executed.");
        try {
            // Hold the lock briefly so the contention between t1 and t3 is visible.
            Thread.sleep(5);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    public void eat() {
        System.out.println(Thread.currentThread().getName() + " asynchronous eat method executed.");
    }

    public static void main(String[] args) {
        MyThread002 m1 = new MyThread002();

        // t1 and t3 both call the synchronized work() on the same instance,
        // so they compete for m1's intrinsic lock and run one after the other.
        new Thread(new Runnable() {
            @Override
            public void run() {
                m1.work();
            }
        }, "t1").start();

        // t2 and t4 call the unsynchronized eat(), so they never wait for the lock.
        new Thread(new Runnable() {
            @Override
            public void run() {
                m1.eat();
            }
        }, "t2").start();

        new Thread(new Runnable() {
            @Override
            public void run() {
                m1.work();
            }
        }, "t3").start();

        new Thread(new Runnable() {
            @Override
            public void run() {
                m1.eat();
            }
        }, "t4").start();
    }
}
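When this example runs, t1 and t3 both need m1's intrinsic lock to enter work(), so one of them prints only after the other has finished its 5 ms sleep, while t2 and t4 call eat() without touching the lock, so their output can appear in between. A synchronized instance method locks on this, which is equivalent to wrapping the method body in a synchronized (this) block. The sketch below shows that equivalent form; the class name WorkWithBlock and its small main method are illustrative additions, not part of the original example.

package com.skcc.mthread;

// Illustrative sketch: the same mutual exclusion as a synchronized method,
// written with an explicit synchronized (this) block.
public class WorkWithBlock {

    public void work() {
        synchronized (this) {
            // Only one thread per instance can be inside this block at a time,
            // exactly like the synchronized work() method in MyThread002.
            System.out.println(Thread.currentThread().getName() + " work (synchronized block) executed.");
            try {
                Thread.sleep(5);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    public static void main(String[] args) {
        WorkWithBlock w = new WorkWithBlock();
        new Thread(w::work, "t1").start();
        new Thread(w::work, "t2").start();
    }
}

One practical advantage of the block form is that the critical section can be narrowed to just the statements that need protection, or the lock can be taken on a private object instead of this, which keeps outside code from locking on the same monitor.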

向AI問(wèn)一下細(xì)節(jié)

免責(zé)聲明:本站發(fā)布的內(nèi)容(圖片、視頻和文字)以原創(chuàng)、轉(zhuǎn)載和分享為主,文章觀點(diǎn)不代表本網(wǎng)站立場(chǎng),如果涉及侵權(quán)請(qǐng)聯(lián)系站長(zhǎng)郵箱:is@yisu.com進(jìn)行舉報(bào),并提供相關(guān)證據(jù),一經(jīng)查實(shí),將立刻刪除涉嫌侵權(quán)內(nèi)容。

AI