Implement an efficient attention mechanism using einsum operations. Complete the function `efficient_attention`, which takes three tensors (queries, keys, and values) and returns the output of the attention mechanism, computed with einsum operations. The output tensor has the same shape as the inputs.
Example 1:
{
  "input": "queries = torch.randn(2, 3, 4), keys = torch.randn(2, 3, 4), values = torch.randn(2, 3, 4)",
  "output": "torch.Size([2, 3, 4])"
}

Example 2:
{
  "input": "queries = torch.randn(5, 10, 6), keys = torch.randn(5, 10, 6), values = torch.randn(5, 10, 6)",
  "output": "torch.Size([5, 10, 6])"
}