Triple T1180412

| Position | Surface form | Disambiguated ID | Type / Status |
|---|---|---|---|
| Subject | deep feedforward networks | E25122 | entity |
| Predicate | canUseActivationFunction | P9928 | FINISHED |
| Object | ReLU | E134578 | NE FINISHED |

E134578 (ReLU): ReLU (Rectified Linear Unit) is a widely used activation function in neural networks that outputs zero for negative inputs and the input value itself for positive inputs, enabling efficient and stable training of deep models.
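The entity description above amounts to a one-line function; a minimal sketch (the name `relu` is ours, not part of the record):

```python
def relu(x: float) -> float:
    """Rectified Linear Unit: 0 for negative inputs, the input itself for positive."""
    return max(0.0, x)
```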
Provenance (6 batches)
| Stage | Batch ID | Job type | Status |
|---|---|---|---|
| creating | batch_69a494267b4c819088c97a59182bf56a | elicitation | completed |
| NER | batch_69a4bd32c5f48190b4e2d39fa052cbb7 | ner | completed |
| NED1 | batch_69ac6f1f1c188190a96f5718c4e7d59d | ned_source_triple | completed |
| NED2 | batch_69ac7009b52c8190a6f6962cf60de92d | ned_description | completed |
| NEDg | batch_69ac6f9c0da48190b8d5615ba582366c | nedg | completed |
| PD | batch_69a4bb59ca6c81908597a81646674aaa | pd | completed |
Created at: March 1, 2026, 7:45 p.m.
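For illustration, the record above (three disambiguated slots plus six provenance batches) could be modeled as plain data classes. This is a minimal sketch, assuming a schema the page does not actually expose; all class, field, and method names here (`Slot`, `Batch`, `Triple`, `fully_processed`) are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Slot:
    """One position of the triple: a surface form plus its disambiguation."""
    surface_form: str        # e.g. "deep feedforward networks"
    disambiguated_id: str    # e.g. "E25122" for entities, "P9928" for predicates
    status: str              # e.g. "entity", "FINISHED", "NE FINISHED"

@dataclass
class Batch:
    """One provenance batch: a pipeline job that produced or refined the triple."""
    stage: str               # e.g. "creating", "NER", "NED1", "NED2", "NEDg", "PD"
    batch_id: str            # e.g. "batch_69a4bd32c5f48190b4e2d39fa052cbb7"
    job_type: str            # e.g. "elicitation", "ner", "ned_source_triple", "pd"
    status: str              # e.g. "completed"

@dataclass
class Triple:
    """A subject-predicate-object triple with its disambiguation provenance."""
    triple_id: str           # e.g. "T1180412"
    subject: Slot
    predicate: Slot
    obj: Slot                # named obj to avoid shadowing the builtin `object`
    provenance: list[Batch] = field(default_factory=list)

    def fully_processed(self) -> bool:
        # The triple counts as done once every provenance batch has completed.
        return all(b.status == "completed" for b in self.provenance)
```

Under this sketch, all six of T1180412's batches report `completed`, so `fully_processed()` would return `True` for this record.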